Hyperparameter optimization and neural architecture search can become prohibitively expensive for standard black-box Bayesian optimization, because training and evaluating a single model can easily take several hours. To overcome this, we introduce a comprehensive tool suite for effective multi-fidelity Bayesian optimization and the analysis of its runs. The suite, written in Python, provides a simple way to specify complex design spaces, a robust and efficient combination of Bayesian optimization and HyperBand, and a comprehensive analysis of the optimization process and its outcomes.
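The core multi-fidelity primitive that HyperBand (and hence BOHB) builds on is successive halving: evaluate many configurations on a small budget, keep only the best fraction, and re-evaluate the survivors on a larger budget. The sketch below illustrates this idea in plain Python with a toy objective; the function names and the objective are hypothetical and not part of the HpBandSter API.

```python
import random

def successive_halving(configs, evaluate, min_budget=1, eta=3):
    """One bracket of successive halving (illustrative sketch):
    evaluate all configs on a small budget, keep the best 1/eta,
    multiply the budget by eta, and repeat until one config remains."""
    budget = min_budget
    while len(configs) > 1:
        scores = sorted((evaluate(c, budget), c) for c in configs)
        configs = [c for _, c in scores[:max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

# Toy objective (hypothetical): the loss approaches |x - 0.6| as the
# training budget grows, mimicking a learning curve.
def evaluate(x, budget):
    return abs(x - 0.6) + 1.0 / budget

random.seed(0)
candidates = [random.random() for _ in range(27)]
best = successive_halving(candidates, evaluate)
```

BOHB replaces the uniform random sampling of candidates with a model-based (Bayesian) proposal, which is what makes it both robust early on and sample-efficient later.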
- https://github.com/automl/BOAH: examples and Jupyter notebooks
- https://github.com/automl/ConfigSpace: configuration-space objects for defining the search space
- https://github.com/automl/HpBandSter: implementations of BOHB, Hyperband, successive halving, and random search
- https://github.com/automl/CAVE: analysis and visualization of HPO runs
- Exemplary CAVE report (30 MB)