Shrinkage Initialization for Smooth Learning of Neural Networks
By: Miao Cheng, Feiyan Zhou, Hongwei Zou, and more
Potential Business Impact:
Makes computer learning faster and more reliable.
The success of intelligent systems has relied heavily on the automated learning of information, which has led to the broad application of neural learning solutions. It is well understood that the training of neural networks can be largely improved by carefully designed initialization, neuron layers, and activation functions. Though sequential layer-based initializations are available, a generalized solution for the initial stages is still desired. In this work, an improved approach to the initialization of neural learning is presented, which adopts a shrinkage approach to initialize the transformation of each layer of the network. It can be universally adapted to network structures with arbitrary layers, while stable performance is attained. Furthermore, smooth learning of networks is adopted in this work, owing to its diverse influence on neural learning. Experimental results on several artificial data sets demonstrate that the proposed method produces robust results with shrinkage initialization and is competent for the smooth learning of neural networks.
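The abstract does not spell out the initializer itself, but a common reading of "shrinkage" is blending a random weight matrix with a structured target. The Python sketch below illustrates that idea for a network with arbitrary layer widths; the function name `shrinkage_init`, the identity-like target, and the coefficient `lam` are illustrative assumptions, not the authors' published method.

```python
import numpy as np

def shrinkage_init(fan_in, fan_out, lam=0.3, rng=None):
    """Hypothetical shrinkage-style initializer (an assumption, not the
    paper's exact scheme): blends a random Gaussian weight matrix with a
    structured shrinkage target. `lam` controls how strongly the random
    weights are shrunk toward the target."""
    rng = np.random.default_rng() if rng is None else rng
    # Random component, scaled as in common variance-preserving schemes
    W_rand = rng.standard_normal((fan_in, fan_out)) * np.sqrt(2.0 / fan_in)
    # Shrinkage target: an identity matrix padded/truncated to the layer shape
    target = np.eye(fan_in, fan_out)
    # Convex combination: lam = 0 recovers the purely random initialization
    return (1.0 - lam) * W_rand + lam * target

# Usage: initialize weights for a small network with arbitrary layer widths
layer_sizes = [64, 128, 32, 10]
weights = [shrinkage_init(m, n, lam=0.3)
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]
```

One appeal of such a scheme, consistent with the abstract's claim of universality, is that it imposes no constraint on layer count or width: each layer's weights are shrunk toward a target of matching shape, regardless of the surrounding architecture.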
Similar Papers
Understanding Two-Layer Neural Networks with Smooth Activation Functions
Machine Learning (CS)
Unlocks how computer brains learn to solve problems.
Sinusoidal Initialization, Time for a New Start
Machine Learning (CS)
Makes computer brains learn much faster and better.
Adaptive Width Neural Networks
Machine Learning (CS)
Makes computer brains learn better and smaller.