Definition:
Tuning /ˈtjuː.nɪŋ/ noun — In machine learning, tuning is the process of adjusting a model’s hyperparameters or configuration settings to improve its performance on a given task. Unlike training, which learns a model’s internal parameters from data, tuning selects these external settings manually or algorithmically, typically before training begins and sometimes while it runs.
Common hyperparameters that require tuning include (see the example search space after this list):
- Learning rate
- Number of layers or neurons
- Batch size
- Regularization strength
- Dropout rate
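For illustration, a tuning run usually begins by declaring a search space over settings like those above. The snippet below is a minimal Python sketch; the variable name `search_space` and the specific value ranges are assumptions chosen for the example, not part of this entry.

```python
# Hypothetical search space covering the hyperparameters listed above.
# The candidate values are illustrative; sensible ranges depend on the
# model architecture and the dataset.
search_space = {
    "learning_rate": [1e-4, 1e-3, 1e-2],   # optimizer step size
    "num_layers": [2, 4, 8],               # model depth (number of layers)
    "batch_size": [32, 64, 128],           # samples per gradient update
    "weight_decay": [0.0, 1e-4, 1e-2],     # regularization strength
    "dropout_rate": [0.0, 0.1, 0.3],       # fraction of units dropped
}
```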
Tuning techniques include (see the code sketch after this list):
- Grid Search – exhaustively evaluates every combination of values in a predefined grid
- Random Search – samples combinations at random from the search space
- Bayesian Optimization – builds a probabilistic model of performance over the hyperparameter space and uses it to pick promising settings to try next
- Automated Machine Learning (AutoML) – automates the tuning loop, and often model selection as well, with little manual intervention
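The first two techniques are simple enough to sketch directly. The example below, in plain Python, reuses the `search_space` from the previous sketch; the `train_and_evaluate` function is a hypothetical stand-in that a real project would replace with actual model training and validation.

```python
import itertools
import random

def train_and_evaluate(config):
    # Hypothetical stand-in for real training: derives a deterministic
    # pseudo-score from the configuration so the sketch runs end to end.
    # In practice this would train a model with `config` and return a
    # validation metric such as accuracy.
    return random.Random(str(sorted(config.items()))).random()

def grid_search(space):
    # Grid search: exhaustively evaluates every combination of values.
    names = list(space)
    best = None
    for values in itertools.product(*(space[n] for n in names)):
        config = dict(zip(names, values))
        score = train_and_evaluate(config)
        if best is None or score > best[1]:
            best = (config, score)
    return best

def random_search(space, n_trials=20, seed=0):
    # Random search: samples a fixed number of configurations at random.
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        config = {name: rng.choice(values) for name, values in space.items()}
        score = train_and_evaluate(config)
        if best is None or score > best[1]:
            best = (config, score)
    return best

best_config, best_score = random_search(search_space, n_trials=10)
```

In practice, libraries such as scikit-learn (GridSearchCV, RandomizedSearchCV) and Optuna provide ready-made versions of these search loops, and Bayesian optimization replaces the blind sampling above with a model of past results.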
Tuning is essential for:
- Improving model accuracy and generalization
- Reducing overfitting or underfitting
- Enhancing training efficiency
Effective tuning plays a critical role in deploying robust and high-performing AI systems across diverse applications, from natural language processing to computer vision and forecasting.

