BOHB

Modern deep learning methods are very sensitive to many hyperparameters, and, due to the long training times of state-of-the-art models, vanilla Bayesian hyperparameter optimization is typically computationally infeasible. On the other hand, bandit-based configuration evaluation approaches based on random search, such as Hyperband, lack guidance and do not converge to the best configurations as quickly as model-based approaches. We propose BOHB, which combines the benefits of both Bayesian optimization and bandit-based methods to achieve the best of both worlds: strong anytime performance and fast convergence to optimal configurations. Our method is robust and versatile, while at the same time being conceptually simple and easy to implement.
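As a concrete illustration, below is a minimal, self-contained Python sketch of the core idea, not the authors' implementation: successive halving supplies the budget schedule, while a simple density model fit to the best observations so far (a stand-in for BOHB's TPE-like kernel density estimators) guides where to sample next, with a fixed fraction of purely random configurations retained for exploration, as in the paper. The toy objective `toy_loss`, the one-dimensional search space, and all constants are illustrative assumptions; BOHB itself fits separate models of good and bad configurations per budget, samples from their ratio, and cycles through Hyperband's full schedule of brackets rather than repeating a single one.

```python
# Conceptual sketch of BOHB's core loop (illustrative, not the reference code).
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def toy_loss(x, budget):
    """Hypothetical 1-d objective; larger budgets give less noisy evaluations."""
    return (x - 0.3) ** 2 + rng.normal(scale=1.0 / budget)

def sample_config(observations, random_fraction=0.3, top_quantile=0.25):
    """Sample near the best configs seen so far via a KDE; fall back to
    random search early on and with a fixed probability (as BOHB does)."""
    if len(observations) < 5 or rng.random() < random_fraction:
        return rng.uniform(0.0, 1.0)
    xs, losses = zip(*observations)
    order = np.argsort(losses)
    n_good = max(2, int(top_quantile * len(xs)))
    good = np.asarray(xs)[order[:n_good]]
    kde = gaussian_kde(good)
    return float(np.clip(kde.resample(1)[0, 0], 0.0, 1.0))

def successive_halving(observations, n_configs=27, min_budget=1, eta=3):
    """One bracket: evaluate on a small budget, keep the top 1/eta,
    multiply the budget by eta, and repeat until one config remains."""
    configs = [sample_config(observations) for _ in range(n_configs)]
    budget = min_budget
    while True:
        results = sorted(((x, toy_loss(x, budget)) for x in configs),
                         key=lambda r: r[1])
        observations.extend(results)
        configs = [x for x, _ in results[: max(1, len(results) // eta)]]
        if len(configs) == 1:
            return results[0]
        budget *= eta

obs = []  # shared across brackets so the model keeps improving
for bracket in range(4):
    best_x, best_loss = successive_halving(obs)
    print(f"bracket {bracket}: best x = {best_x:.3f}, loss = {best_loss:.3f}")
```

Running the sketch prints the best configuration found in each bracket; because later brackets reuse all earlier observations, the model-guided samples concentrate around the toy optimum at x = 0.3. The retained random fraction is the same safeguard the method uses to keep the theoretical guarantees of random-search-based Hyperband.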

References

  • Falkner, Stefan; Klein, Aaron; Hutter, Frank:
    BOHB: Robust and Efficient Hyperparameter Optimization at Scale.
    In: Proceedings of the 35th International Conference on Machine Learning (ICML 2018).