The Impact of Neural Network Initialization — Phase Diagram Analysis, Parameter Condensation, and Loss Landscape Embedding
Date:
How does the scale of weight initialization affect the behavior of a neural network? In this session, we share the research results of Professor Zhiqin Xu's team on this question. "Phase diagram analysis" characterizes the training dynamics of neural networks under different initialization regimes; the "condensation phenomenon" describes how neuron weights cluster toward a few directions during training, an effect that helps improve generalization; the "loss landscape embedding principle" explains how the loss landscape of a wide network contains the critical points of narrower networks, guiding the model to converge to solutions of low complexity. Together, these results deepen our understanding of how neural networks train and generalize.
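
To make the condensation phenomenon concrete, below is a minimal sketch (not the team's exact experimental setup): a two-layer ReLU network is trained on 1-D synthetic data under two initialization scales, and the average pairwise cosine similarity of the first-layer weight directions is reported as a rough condensation diagnostic. The scale parameter `gamma`, the use of Adam, and the synthetic target are all illustrative assumptions.

```python
# Sketch: compare "large" vs "small" initialization and check whether
# first-layer weight directions cluster ("condense") after training.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_net(width=50, gamma=1.0):
    """Two-layer ReLU net; weights drawn from N(0, width**(-gamma)).
    Larger gamma means a smaller initialization scale."""
    net = nn.Sequential(nn.Linear(1, width), nn.ReLU(), nn.Linear(width, 1))
    std = width ** (-gamma)
    for layer in (net[0], net[2]):
        nn.init.normal_(layer.weight, std=std)
        nn.init.zeros_(layer.bias)
    return net

def mean_abs_cosine(net):
    """Average |cos| between first-layer weight vectors (bias appended);
    values near 1 suggest neurons have condensed onto few directions."""
    w = torch.cat([net[0].weight, net[0].bias.unsqueeze(1)], dim=1)
    w = w / (w.norm(dim=1, keepdim=True) + 1e-12)
    cos = w @ w.t()
    off_diag = cos[~torch.eye(len(w), dtype=torch.bool)]
    return off_diag.abs().mean().item()

# Synthetic 1-D regression target.
x = torch.linspace(-1, 1, 64).unsqueeze(1)
y = torch.sin(3 * x)

for gamma in (0.5, 1.5):  # roughly "large" vs "small" initialization
    net = make_net(gamma=gamma)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    for _ in range(5000):
        opt.zero_grad()
        loss = ((net(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
    print(f"gamma={gamma}: loss={loss.item():.4f}, "
          f"mean |cos| between neurons={mean_abs_cosine(net):.3f}")
```

Under this kind of setup, the small-initialization run typically shows a noticeably higher mean cosine similarity, i.e. many neurons align to a small number of directions, which is the qualitative behavior the condensation results describe.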
