
Gradient descent

Gradient descent is an optimization algorithm often used in machine learning and data science to minimize a function. By iteratively moving in the direction of steepest descent, defined by the negative of the gradient, it finds a local minimum of a differentiable function. It is particularly useful when the number of features (and thus the complexity of the function) is very large.
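As a minimal sketch of the idea (a hypothetical example, not tied to any particular library), the loop below minimizes the simple differentiable function f(x, y) = x² + y² by repeatedly stepping against its gradient:

```python
# Minimal gradient descent sketch (illustrative example; function and
# parameters are assumptions, not from the source).

def gradient(point):
    # Analytic gradient of f(x, y) = x^2 + y^2 is (2x, 2y).
    x, y = point
    return (2 * x, 2 * y)

def gradient_descent(start, learning_rate=0.1, iterations=100):
    point = start
    for _ in range(iterations):
        gx, gy = gradient(point)
        # Step in the direction of steepest descent: the negative gradient.
        point = (point[0] - learning_rate * gx,
                 point[1] - learning_rate * gy)
    return point

if __name__ == "__main__":
    minimum = gradient_descent(start=(3.0, -4.0))
    print(minimum)  # Approaches (0, 0), the function's minimum.
```

The learning rate and iteration count here are arbitrary illustrative choices; in practice they are tuned to the problem, since too large a step can overshoot the minimum and too small a step converges slowly.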
Volume: 60.5K · Growth: +32% (exploding)