Glorot
Glorot is a surname that has become notable in the field of machine learning through the work of Xavier Glorot. He is best known for co-authoring a 2010 paper with Yoshua Bengio that introduced a weight initialization scheme now commonly referred to as Glorot initialization or Xavier initialization. The method addresses the difficulty of training deep neural networks by setting initial weights to preserve signal variance across layers.
Glorot initialization defines weight distributions based on the number of incoming and outgoing connections of a layer, commonly denoted fan_in and fan_out. Two variants are widely used:
- Glorot uniform: W is drawn from a uniform distribution on [-limit, +limit], where limit = sqrt(6 / (fan_in + fan_out)).
- Glorot normal: W is drawn from a normal distribution with zero mean and standard deviation std = sqrt(2 / (fan_in + fan_out)).
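The two variants above can be sketched in a few lines of NumPy. This is a minimal illustration, not a framework implementation; the function names and the (fan_in, fan_out) weight shape are choices made here for clarity.

```python
import numpy as np

def glorot_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix from U(-limit, +limit),
    with limit = sqrt(6 / (fan_in + fan_out))."""
    rng = rng or np.random.default_rng()
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def glorot_normal(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix from N(0, std^2),
    with std = sqrt(2 / (fan_in + fan_out))."""
    rng = rng or np.random.default_rng()
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))
```

Both samplers produce weights with variance 2 / (fan_in + fan_out); the uniform limit sqrt(6 / (fan_in + fan_out)) follows because a uniform distribution on [-a, +a] has variance a^2 / 3.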
Usage and impact: Glorot initialization has become a standard default in many deep learning frameworks because it keeps the magnitudes of activations and gradients roughly stable at the start of training, which makes deeper networks easier to optimize; for example, it is the default kernel initializer for Keras Dense layers.
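The variance-preservation property can be demonstrated directly: pushing a unit-variance batch through several Glorot-initialized linear layers keeps the signal variance near 1, while an arbitrary fixed scale makes it grow or shrink geometrically. The layer sizes and the naive std of 0.1 below are illustrative choices; activations are omitted to isolate the scaling effect.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 256  # layer width (fan_in == fan_out, chosen for illustration)

def glorot_uniform(fan_in, fan_out, rng):
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

x = rng.normal(0.0, 1.0, size=(1000, n))  # batch of unit-variance inputs

# Glorot init: each layer multiplies the variance by fan_in * Var(w) = 1.
h = x
for _ in range(5):
    h = h @ glorot_uniform(n, n, rng)

# Naive init with fixed std 0.1: variance grows by n * 0.01 = 2.56 per layer.
g = x
for _ in range(5):
    g = g @ rng.normal(0.0, 0.1, size=(n, n))

print(h.var())  # stays close to 1
print(g.var())  # blows up geometrically
```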