The Overlooked Limitations of Grid Search and Random Search
...and what to try instead.
AVI CHAWLA
AUG 11, 2023


Hyperparameter tuning is a tedious task in training ML models.

Typically, we use two common approaches for this:

Grid search

Random search

Grid search and random search

But they have many limitations.

For instance:

Grid search performs an exhaustive search over all combinations. This is computationally expensive.

Grid search and random search are restricted to the specified hyperparameter range. Yet, the ideal hyperparameter may exist outside that range.

Limited search space with grid search and random search

They can ONLY perform discrete searches, even if the hyperparameter is continuous.

Grid search and random search can only try discrete values for continuous hyperparameters
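
To make these limitations concrete, here is a minimal sketch using scikit-learn's GridSearchCV and RandomizedSearchCV. The dataset, model, and hyperparameter range are illustrative assumptions, not the exact setup from the deep dive; the point is that both searches only ever evaluate a finite set of discrete values for the continuous regularization parameter C, and only within the range we specify.

```python
# Minimal sketch (assumed setup): grid search and random search over a
# continuous hyperparameter can only try a finite set of discrete values.
from scipy.stats import uniform
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=1000, random_state=42)

# Grid search: exhaustively fits every listed value of C (continuous in theory,
# but restricted here to 4 discrete points inside a fixed range).
grid = GridSearchCV(
    LogisticRegression(max_iter=1000),
    param_grid={"C": [0.01, 0.1, 1, 10]},
    cv=5,
)
grid.fit(X, y)

# Random search: samples C at random, but still only from the specified range,
# and still as a finite number of discrete trials.
rand = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions={"C": uniform(0.01, 10)},
    n_iter=10,
    cv=5,
    random_state=42,
)
rand.fit(X, y)

print(grid.best_params_, rand.best_params_)
```

Adding more hyperparameters or more candidate values multiplies the number of fits the grid must perform, which is exactly the exhaustive-cost problem described above.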

To this end, Bayesian Optimization is a highly underappreciated yet immensely powerful approach for tuning hyperparameters.

It uses Bayesian statistics to estimate the distribution of the best hyperparameters.

This allows it to take informed steps to select the next set of hyperparameters.
As a result, it gradually converges to an optimal set of hyperparameters much
faster.
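
As a rough illustration of that loop, here is a minimal sketch using the scikit-optimize library (an assumed choice of tooling; the objective, model, and search range below are also illustrative). A Gaussian process surrogate is fit to the hyperparameters tried so far, and an acquisition function proposes the next value to evaluate.

```python
# Minimal sketch (assumed tooling: scikit-optimize). gp_minimize fits a Gaussian
# process surrogate to the evaluations made so far and uses an acquisition
# function to pick the next hyperparameter to try, i.e. informed steps rather
# than an exhaustive or purely random sweep.
from skopt import gp_minimize
from skopt.space import Real
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1000, random_state=42)

def objective(params):
    (C,) = params
    model = LogisticRegression(C=C, max_iter=1000)
    # gp_minimize minimizes, so return the negative cross-validated accuracy.
    return -cross_val_score(model, X, y, cv=5).mean()

result = gp_minimize(
    objective,
    dimensions=[Real(1e-3, 1e2, prior="log-uniform", name="C")],
    n_calls=20,  # typically far fewer evaluations than an exhaustive grid
    random_state=42,
)

print("Best C:", result.x[0])
print("Best CV accuracy:", -result.fun)
```

Because the search space is declared as a continuous Real dimension, the optimizer is not restricted to a handful of pre-listed values, and each new trial is chosen based on everything observed so far.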

The efficacy is evident from the image below.

Bayesian optimization leads the model to the same F1 score but:

it takes 7x fewer iterations

it executes 5x faster

it reaches the optimal configuration earlier

But how exactly does it work, and why is it so effective?

What is the core intuition behind Bayesian optimization?

How does it optimally reduce the search space of the hyperparameters?

If you are curious, this is precisely what we cover in today's extensive machine learning deep dive.

Bayesian Optimization Article

The idea behind Bayesian optimization seemed extremely compelling to me when I first learned about it a few years back.

Learning about this optimized approach to hyperparameter tuning and putting it to use has been extremely helpful to me in building large ML models quickly.

Learning about Bayesian optimization will be immensely valuable if you envision doing the same.

Thus, today’s article covers:

Issues with traditional hyperparameter tuning approaches.

The motivation for Bayesian optimization.

How Bayesian optimization works.

The intuition behind Bayesian optimization.

Results from the research paper that proposed Bayesian optimization for
hyperparameter tuning.

A hands-on Bayesian optimization experiment.

Comparing Bayesian optimization with grid search and random search.

Analyzing the results of Bayesian optimization.

Best practices for using Bayesian optimization.

👉 Interested folks can read it here: Bayesian Optimization for Hyperparameter Tuning.

Hope you will learn something new today :)

👉 If you liked this post, don’t forget to leave a like ❤️. It helps more people
discover this newsletter on Substack and tells me that you appreciate reading
these daily insights. The button is located towards the bottom of this email.

Thanks for reading!
