
Vanishing gradient problem

The vanishing gradient problem is an issue encountered when training deep neural networks: as error gradients are propagated backward through many layers, repeated multiplication by small derivatives (for example, those of saturating activations such as sigmoid or tanh) causes the gradients reaching the early layers to shrink toward zero. The result is slow or stalled learning, because those layers receive almost no signal with which to update their weights. The problem primarily affects deep networks and can be mitigated by techniques such as ReLU activation functions, careful weight initialization, batch normalization, and residual (skip) connections.
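A minimal sketch of the effect, assuming PyTorch: the helper below builds a deep multilayer perceptron and reports the gradient norm at the first layer after one backward pass. The depth, width, and batch size are illustrative choices, not prescribed values; with sigmoid activations the first-layer gradient is typically orders of magnitude smaller than with ReLU.

import torch
import torch.nn as nn

def first_layer_grad_norm(act_cls, depth=20, width=64):
    # Stack `depth` linear layers, each followed by the given activation.
    torch.manual_seed(0)
    layers = []
    for _ in range(depth):
        layers += [nn.Linear(width, width), act_cls()]
    net = nn.Sequential(*layers, nn.Linear(width, 1))

    # One forward/backward pass on a dummy batch.
    x = torch.randn(32, width)
    net(x).mean().backward()

    # Gradient magnitude at the layer farthest from the loss.
    return net[0].weight.grad.norm().item()

print("sigmoid:", first_layer_grad_norm(nn.Sigmoid))  # near zero
print("relu:   ", first_layer_grad_norm(nn.ReLU))     # noticeably larger

Running this shows why ReLU appears among the standard mitigations: its derivative is 1 on the active region, so the backward signal is not repeatedly scaled down as it passes through each layer.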