Abstract
The initialization of neural networks strongly affects their performance. The prevalent initialization method is to draw random samples from a distribution determined by the network's structure. This work presents a general yet effective method to initialize neural networks. In repeated experiments training variants of Mobile-Net on down-sampled variants of Image-Net, the method yields accuracy gains and loss decreases across most of the test and validation sets. For example, for Mobile-Net (v1) and Image-Net (32 × 32), we obtained a 2.5% accuracy improvement on both the test and validation sets.
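The "random sample based on the network's structure" that the abstract names as the prevalent baseline refers to shape-dependent schemes such as He initialization, where each layer's weights are drawn with a variance set by its fan-in. The paper's own method is not described in this record; the sketch below only illustrates the standard baseline, and the function name and layer sizes are illustrative assumptions:

```python
import numpy as np

def he_init(fan_in: int, fan_out: int, rng=None):
    """He initialization: sample weights from N(0, 2/fan_in).

    The variance depends on the layer's shape (its fan-in), which is
    what makes this a structure-dependent random initialization.
    """
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Illustrative example: weights for a 128 -> 64 fully connected layer.
W = he_init(128, 64)
print(W.shape)  # (128, 64)
```

Each layer of the network gets its own sample, so the full initialization is fixed entirely by the architecture and the random seed.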
| Original language | English |
|---|---|
| Title of host publication | Proceedings - 2025 11th International Conference on Computing and Artificial Intelligence, ICCAI 2025 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 250-255 |
| Number of pages | 6 |
| ISBN (Electronic) | 9798331524913 |
| DOIs | |
| State | Published - 2025 |
| Event | 11th International Conference on Computing and Artificial Intelligence, ICCAI 2025 - Kyoto, Japan Duration: 28 Mar 2025 → 31 Mar 2025 |
Publication series
| Name | Proceedings - 2025 11th International Conference on Computing and Artificial Intelligence, ICCAI 2025 |
|---|---|
Conference
| Conference | 11th International Conference on Computing and Artificial Intelligence, ICCAI 2025 |
|---|---|
| Country/Territory | Japan |
| City | Kyoto |
| Period | 28/03/25 → 31/03/25 |
Bibliographical note
Publisher Copyright: © 2025 IEEE.
Keywords
- Neural networks
- Neural networks initialization
- Non-convex optimization
- Optimization
ASJC Scopus subject areas
- Computer Science Applications
- Control and Systems Engineering
- Computer Graphics and Computer-Aided Design
- Artificial Intelligence