
Hyperparameter Tuning

What is hyperparameter tuning in machine learning? 

Hyperparameter tuning is the process of training multiple models, each with different hyperparameter values, to find a combination of hyperparameters that optimizes model performance.

Hyperparameter exploration

There are several methods for hyperparameter tuning, including grid search, random search, Bayesian optimization, and Tree-structured Parzen Estimators (TPE). Each runs a series of trials, training a model with the hyperparameter combination selected by the method and measuring its performance. Bayesian optimization and TPE additionally maintain a surrogate model, updated after each trial, that guides the selection of hyperparameters for the next trial.
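The trial loop described above can be sketched with random search, the simplest of these methods. This is a minimal illustration, not a reference implementation: the hyperparameter ranges and the objective function are hypothetical stand-ins for training and scoring a real model.

```python
import random

# Hypothetical stand-in for "train a model with these hyperparameters and
# return its validation error"; a real trial would fit an actual model.
def train_and_evaluate(learning_rate, num_layers):
    # Toy objective (lower is better), minimized near lr=0.1, 3 layers.
    return (learning_rate - 0.1) ** 2 + (num_layers - 3) ** 2

def random_search(num_trials, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(num_trials):
        # Each trial samples one hyperparameter combination to explore.
        params = {
            "learning_rate": rng.uniform(0.001, 1.0),
            "num_layers": rng.randint(1, 8),
        }
        score = train_and_evaluate(**params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search(num_trials=50)
print(best_params, best_score)
```

Grid search replaces the random sampling with an exhaustive sweep over predefined candidate values, while Bayesian optimization and TPE replace it with sampling guided by a surrogate model of past trial results.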

Training data, hyperparameter tuning, and avoiding leakage into model training

Typically, you will have a fixed dataset when evaluating different combinations of hyperparameter values. You split this dataset into three sets: train, validation, and test. You evaluate a combination of hyperparameters by training a model with those hyperparameters on the train set and then measuring the trained model's performance on the validation set. You never use the test set to evaluate hyperparameter combinations; otherwise, information from the test set would leak into your model training.

When you have finished your hyperparameter tuning experiments, you select the best hyperparameters, train a final model on the combined train and validation sets, and evaluate that model's performance on the held-out test set.
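The three-way split described above can be sketched as follows. The dataset here is a hypothetical list of (feature, label) pairs, and the 20%/20% fractions are illustrative defaults, not prescribed values.

```python
import random

# Hypothetical dataset of (feature, label) pairs; a real workflow
# would load actual training data here.
data = [(float(i), 2.0 * i + 1.0) for i in range(100)]

def three_way_split(rows, val_frac=0.2, test_frac=0.2, seed=0):
    """Shuffle and split rows into train / validation / test sets."""
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    n_test = int(len(rows) * test_frac)
    n_val = int(len(rows) * val_frac)
    test = rows[:n_test]               # held out until final evaluation
    val = rows[n_test:n_test + n_val]  # used to compare hyperparameters
    train = rows[n_test + n_val:]      # used to fit each candidate model
    return train, val, test

train, val, test = three_way_split(data)
print(len(train), len(val), len(test))
```

During tuning, each candidate model is fit on `train` and scored on `val`; only after the best hyperparameters are chosen is a final model fit on `train + val` and scored once on `test`.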

Is it important to tune hyperparameters?

Tuning hyperparameters is important because it can significantly affect a model's performance. Choosing the right combination of hyperparameters can improve the model's ability to generalize and help prevent overfitting.

How can I perform some hyperparameter tuning?

Hyperparameter tuning can be performed manually by testing different combinations of hyperparameters and evaluating their performance. 

However, this can be time-consuming and impractical for larger models. Automated hyperparameter tuning techniques such as grid search, random search, and Bayesian optimization can be used to explore the hyperparameter space efficiently and find a good combination of hyperparameters for a given model.
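As one concrete example of an automated technique, grid search exhaustively evaluates every combination in the Cartesian product of candidate values. The candidate values and the validation-error function below are hypothetical placeholders for a real train-and-score step.

```python
import itertools

# Hypothetical validation error for a hyperparameter combination; a real
# run would train on the train set and score on the validation set.
def validation_error(learning_rate, batch_size):
    return abs(learning_rate - 0.01) * 100 + abs(batch_size - 32) / 32

# Candidate values to sweep; grid search tries every combination.
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "batch_size": [16, 32, 64],
}

best_params, best_error = None, float("inf")
for values in itertools.product(*grid.values()):
    params = dict(zip(grid.keys(), values))
    error = validation_error(**params)
    if error < best_error:
        best_params, best_error = params, error

print(best_params)
```

Note that the number of combinations grows multiplicatively with each hyperparameter added, which is why random search or Bayesian optimization is usually preferred for larger search spaces.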


© Hopsworks 2023. All rights reserved. Various trademarks held by their respective owners.
