Vanishing gradient problem

The vanishing gradient problem is a difficulty encountered when training deep neural networks: the gradients used by backpropagation become extremely small, effectively disappearing, as they pass backward through the layers. Because backpropagation multiplies layer-by-layer derivatives via the chain rule, many small factors (for example, sigmoid derivatives, which never exceed 0.25) compound into a near-zero gradient by the time it reaches the earliest layers. As a result, the weights and biases of those early layers barely change, hindering the network's ability to learn. The effect is most pronounced in deep networks with many layers.
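
To make the effect concrete, here is a minimal sketch (not from the original entry) that backpropagates a gradient through a randomly initialized stack of sigmoid layers and prints the gradient norm at each layer; the depth, width, weight scale, and random seed are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
depth, width = 20, 64  # illustrative choices, not from the source

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Forward pass: a deep stack of sigmoid layers, caching activations.
weights = [rng.standard_normal((width, width)) * 0.5 for _ in range(depth)]
activations = [rng.standard_normal(width)]
for W in weights:
    activations.append(sigmoid(W @ activations[-1]))

# Backward pass: push a unit gradient from the output toward the input.
# sigmoid'(z) = a * (1 - a) <= 0.25, so the chain rule multiplies in a
# small factor at every layer, shrinking the gradient as it travels back.
grad = np.ones(width)
for i in reversed(range(depth)):
    a = activations[i + 1]
    grad = weights[i].T @ (grad * a * (1.0 - a))
    print(f"layer {i:2d}: ||grad|| = {np.linalg.norm(grad):.3e}")
```

With these settings the printed norm collapses by many orders of magnitude over the twenty layers, illustrating why the earliest layers receive almost no learning signal.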