This document covers common problems in hyperparameter optimization: trusting default values too much, using the wrong evaluation metric, overfitting hyperparameters to the validation data, optimizing too few hyperparameters, relying too heavily on manual tuning, using grid search (which scales poorly with the number of hyperparameters), and using random search (which has high variance). It recommends Bayesian optimization, which can search large hyperparameter spaces efficiently by intelligently choosing the most promising configurations to evaluate next.
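The scaling and variance claims above can be made concrete with a small sketch (an illustrative example, not from the slides; the hyperparameter names and values are assumptions):

```python
import itertools
import random

# A small, hypothetical search space over four hyperparameters.
grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "num_layers": [1, 2, 3, 4],
    "dropout": [0.0, 0.25, 0.5],
    "batch_size": [32, 64, 128],
}

# Grid search must evaluate every combination: 3 * 4 * 3 * 3 = 108 models.
# Each added hyperparameter multiplies the total, so cost grows exponentially.
grid_trials = list(itertools.product(*grid.values()))
print(len(grid_trials))  # 108

# Random search instead draws a fixed budget of configurations from the same
# space, but which regions get explored varies run to run (high variance).
random.seed(0)
random_trials = [
    {name: random.choice(values) for name, values in grid.items()}
    for _ in range(20)
]
print(len(random_trials))  # 20
```

Bayesian optimization spends the same fixed budget more carefully, using results from earlier trials to pick each next configuration.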
Default Values
● Default values are an implicit choice
● Defaults are not always appropriate for your model
● You may build a classifier that looks like this:
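For illustration (a minimal sketch assuming scikit-learn, not the slide's original code), a classifier built this way silently accepts dozens of default hyperparameter choices:

```python
# Illustrative sketch: every hyperparameter left at its library default.
from sklearn.ensemble import RandomForestClassifier

clf = RandomForestClassifier()  # an implicit choice of many defaults

# The defaults are numerous and may not suit your data, e.g. in recent
# scikit-learn versions: n_estimators=100, max_depth=None (grow until pure).
print(clf.get_params()["n_estimators"])  # 100
print(clf.get_params()["max_depth"])     # None
```

Calling `get_params()` is one way to surface the choices you made without noticing.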
References - by Section

Intro
Ian Dewancker. SigOpt for ML: TensorFlow ConvNets on a Budget with Bayesian Optimization.
Ian Dewancker. SigOpt for ML: Unsupervised Learning with Even Less Supervision Using Bayesian Optimization.
Ian Dewancker. SigOpt for ML: Bayesian Optimization for Collaborative Filtering with MLlib.
#1 Trusting the Defaults
Keras recurrent layers documentation
#2 Using the Wrong Metric
Ron Kohavi et al. Trustworthy Online Controlled Experiments: Five Puzzling Outcomes Explained.
Xavier Amatriain. 10 Lessons Learned from Building ML Systems [Video at 19:03].
Image from PhD Comics.
See also: SigOpt in Depth: Intro to Multicriteria Optimization.
#4 Too Few Hyperparameters
Image from TensorFlow Playground.
Ian Dewancker. SigOpt for ML: Unsupervised Learning with Even Less Supervision Using Bayesian Optimization.
#5 Hand Tuning
On algorithms beating experts: Scott Clark, Ian Dewancker, and Sathish Nagappan. Deep Neural Network Optimization with SigOpt and Nervana Cloud.
#6 Grid Search
NoGridSearch.com
#7 Random Search
James Bergstra and Yoshua Bengio. Random search for hyper-parameter optimization.
Ian Dewancker, Michael McCourt, Scott Clark, Patrick Hayes, Alexandra Johnson, and George Ke. A Stratified Analysis of Bayesian Optimization Methods.
Learn More
blog.sigopt.com
sigopt.com/research