Definition:
Gradient Descent /ˈɡreɪ.di.ənt dɪˈsɛnt/ noun — In machine learning and mathematical optimization, gradient descent is an algorithm used to minimize the loss function of a model by iteratively adjusting its parameters in the direction that reduces error most rapidly.
The algorithm computes the gradient (the vector of partial derivatives) of the loss function with respect to the model parameters and updates them by taking small steps in the opposite direction of the gradient, hence “descent.” This process repeats until the parameters converge, typically at a local minimum of the loss (or the global minimum when the loss is convex), ideally yielding a model that predicts well.
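A common way to write this update is θ ← θ − η · ∇θ L(θ), where θ denotes the model parameters, L(θ) is the loss function, ∇θ L(θ) is its gradient, and η is the learning rate that controls the step size.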
There are different variants of gradient descent, distinguished by how much data is used for each update (a code sketch contrasting them follows the list):
- Batch Gradient Descent – uses the entire dataset
- Stochastic Gradient Descent (SGD) – uses one data point at a time
- Mini-batch Gradient Descent – uses small batches of data
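
The sketch below is a minimal illustration of these variants on a simple linear regression problem. The function name `gradient_descent`, the synthetic data, and the choice of mean-squared-error loss are assumptions made for this example, not part of any particular library.

```python
import numpy as np

def gradient_descent(X, y, lr=0.1, epochs=200, batch_size=None, seed=0):
    """batch_size=None -> batch GD; 1 -> SGD; k -> mini-batch GD."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)                              # model parameters
    batch_size = n if batch_size is None else batch_size
    for _ in range(epochs):
        idx = rng.permutation(n)                 # shuffle once per epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            error = Xb @ w - yb                  # prediction error on the batch
            grad = 2 * Xb.T @ error / len(batch) # gradient of the MSE loss
            w -= lr * grad                       # step opposite to the gradient
    return w

# Illustrative synthetic data: y = 3*x0 - 2*x1 plus a little noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + 0.1 * rng.normal(size=200)

print(gradient_descent(X, y))                # batch gradient descent
print(gradient_descent(X, y, batch_size=1))  # stochastic gradient descent
print(gradient_descent(X, y, batch_size=32)) # mini-batch gradient descent
```

The only difference between the three variants here is the `batch_size` argument: the full dataset, a single example, or a small batch of examples per update step.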
Gradient descent is foundational in training neural networks, logistic regression, and other supervised learning models, allowing them to learn from data by reducing prediction error over time.

