Here we give an overview of commonly used and well-known Hyperparameter Optimization (HPO) packages (only a few of which were developed by us). The list is by no means complete; if a package you know is missing, please let us know.
Bayesian Optimization (BO) is considered a state-of-the-art approach for optimizing expensive black-box functions and has therefore been implemented in many HPO packages.
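To illustrate the loop these packages implement, here is a deliberately simplified sketch in plain Python: a toy surrogate (value of the nearest observed point, with distance standing in for a real Gaussian-process posterior's uncertainty) and a lower-confidence-bound acquisition function. All names here are our own illustration, not the API of any package below.

```python
import random

def objective(x):
    # Stand-in for an expensive black-box function (e.g., validation loss).
    return (x - 0.3) ** 2

def surrogate(x, observed):
    # Toy surrogate: predicted mean = value of the nearest observed point;
    # "uncertainty" = distance to it (a crude stand-in for a GP posterior).
    dist, value = min((abs(x - xi), yi) for xi, yi in observed)
    return value, dist

def acquisition(x, observed, kappa=1.0):
    # Lower confidence bound: trade off predicted value against uncertainty.
    mean, unc = surrogate(x, observed)
    return mean - kappa * unc

random.seed(0)
observed = [(x, objective(x)) for x in (0.1, 0.9)]  # initial design
for _ in range(20):
    # Pick the most promising of some random candidates, then evaluate it.
    candidates = [random.uniform(0.0, 1.0) for _ in range(100)]
    x_next = min(candidates, key=lambda x: acquisition(x, observed))
    observed.append((x_next, objective(x_next)))

best_x, best_y = min(observed, key=lambda p: p[1])
```

Real BO packages replace the toy surrogate with a proper probabilistic model (Gaussian processes, random forests, or density estimators) and use principled acquisition functions such as expected improvement.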
Spearmint was one of the first successful open-source Bayesian Optimization packages for HPO.
SMAC3 is well known for optimizing complex and structured spaces (e.g., entire ML pipelines) using random-forest surrogate models.
Hyperopt is a distributed HPO package built around the Tree-structured Parzen Estimator (TPE).
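The TPE idea can be sketched in a few lines of plain Python (our own illustration, not Hyperopt's API): split past evaluations into a "good" and a "bad" set, model each set with a Parzen (kernel-density) estimator, and propose the candidate that maximizes the density ratio l(x)/g(x).

```python
import math
import random

def objective(x):
    # Stand-in for an expensive black-box loss over one hyperparameter.
    return (x - 2.0) ** 2

def parzen(x, points, bandwidth=0.5):
    # Parzen estimator: average of Gaussian kernels centred on the points.
    norm = len(points) * bandwidth * math.sqrt(2.0 * math.pi)
    return sum(math.exp(-0.5 * ((x - p) / bandwidth) ** 2) for p in points) / norm

def propose(history, gamma=0.25, n_candidates=50):
    # Split observations into the best gamma-fraction ("good") and the rest.
    history = sorted(history, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(history)))
    good = [x for x, _ in history[:n_good]]
    bad = [x for x, _ in history[n_good:]] or good
    # Sample near the good points; keep the candidate maximizing l(x)/g(x).
    candidates = [random.gauss(random.choice(good), 0.5) for _ in range(n_candidates)]
    return max(candidates, key=lambda x: parzen(x, good) / max(parzen(x, bad), 1e-12))

random.seed(1)
history = [(x, objective(x)) for x in (random.uniform(-5, 5) for _ in range(5))]
initial_best = min(y for _, y in history)
for _ in range(25):
    x_next = propose(history)
    history.append((x_next, objective(x_next)))

best_x, best_y = min(history, key=lambda t: t[1])
```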
Bayesian Optimization is a package for (constrained) global optimization with Gaussian process surrogates.
HyperMapper is a BO package that supports working with unknown constraints and multi-objective optimization.
OpenBox is a general framework for black-box optimization, including HPO, and supports multi-objective optimization, multi-fidelity optimization, early stopping, transfer learning, and parallel BO.
Dragonfly is an open-source Python library for scalable BO with multi-fidelity and multi-objective optimization.
TuRBO is a scalable BO package for large-scale, parallel HPO problems with many function evaluations.
Evolutionary algorithms (EAs) are another popular black-box optimization approach: EAs are easy to parallelize, and their per-iteration cost does not grow with the number of evaluations performed so far.
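As a purely illustrative example of why EAs parallelize well, here is a minimal (mu + lambda) evolution strategy over a toy hyperparameter space; every name below is our own, not any package's API. The lambda offspring evaluations in each generation are independent of each other, so they can be farmed out to as many workers as are available.

```python
import random

def fitness(cfg):
    # Toy objective over two "hyperparameters"; lower is better.
    return (cfg["lr"] - 0.1) ** 2 + (cfg["layers"] - 3) ** 2

def mutate(cfg):
    # Perturb one parent configuration into an offspring.
    child = dict(cfg)
    child["lr"] = min(1.0, max(1e-4, cfg["lr"] * random.uniform(0.5, 2.0)))
    child["layers"] = max(1, cfg["layers"] + random.choice((-1, 0, 1)))
    return child

random.seed(0)
mu, lam = 4, 8
population = [
    {"lr": 10 ** random.uniform(-4, 0), "layers": random.randint(1, 8)}
    for _ in range(mu)
]
for _ in range(30):
    offspring = [mutate(random.choice(population)) for _ in range(lam)]
    # The lam fitness evaluations are independent, so this step is
    # embarrassingly parallel; its cost also stays constant per generation,
    # no matter how many evaluations have been performed before.
    survivors = sorted(population + offspring, key=fitness)
    population = survivors[:mu]  # elitist (mu + lambda) selection

best = population[0]
```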
DEHB combines multi-fidelity optimization, as in Hyperband, with differential evolution.
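The multi-fidelity scheme that Hyperband builds on is successive halving, which is easy to sketch in plain Python (again with our own illustrative names and a toy loss that improves with budget):

```python
import random

def validation_loss(cfg, budget):
    # Toy stand-in: the loss depends on the configuration and shrinks
    # as the training budget (e.g., number of epochs) grows.
    return (cfg - 0.5) ** 2 + 1.0 / budget

random.seed(0)
configs = [random.uniform(0.0, 1.0) for _ in range(27)]  # random configurations
budget, eta = 1, 3
while len(configs) > 1:
    # Evaluate all survivors cheaply at the current budget ...
    scores = {cfg: validation_loss(cfg, budget) for cfg in configs}
    # ... then keep only the best 1/eta fraction and raise their budget.
    configs = sorted(configs, key=scores.get)[: max(1, len(configs) // eta)]
    budget *= eta

best = configs[0]  # the single configuration promoted to the full budget
```

Hyperband runs several such brackets with different trade-offs between the number of configurations and the starting budget; DEHB additionally generates new configurations via differential evolution instead of pure random sampling.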
DEAP is an evolutionary computation framework that provides implementations of many popular EAs.
Nevergrad is a gradient-free optimization platform that implements a variety of EAs to meet the requirements of different problems.
General HPO packages
Here we introduce several packages that implement both BO and EA approaches.
Optuna is an automatic HPO framework that allows users to construct the search space dynamically (define-by-run).
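"Dynamically constructing the search space" means the space is defined while the objective runs, so later parameters can depend on earlier ones. The sketch below mimics this define-by-run idea with a hypothetical minimal Trial class that just samples randomly; the suggest_* names are modelled on Optuna's interface, but this is not Optuna code.

```python
import random

class Trial:
    # Hypothetical, minimal stand-in for a define-by-run trial object:
    # each suggest_* call draws a value and records it.
    def __init__(self):
        self.params = {}

    def suggest_categorical(self, name, choices):
        self.params[name] = random.choice(choices)
        return self.params[name]

    def suggest_float(self, name, low, high):
        self.params[name] = random.uniform(low, high)
        return self.params[name]

def objective(trial):
    # The search space unfolds as the objective runs: "width" only
    # exists for trials that chose the "mlp" model.
    model = trial.suggest_categorical("model", ["linear", "mlp"])
    loss = 0.0
    if model == "mlp":
        width = trial.suggest_float("width", 16.0, 256.0)
        loss += abs(width - 64.0) / 64.0
    lr = trial.suggest_float("lr", 1e-4, 1e-1)
    return loss + abs(lr - 0.01)

random.seed(0)
results = [(objective(t), t.params) for t in (Trial() for _ in range(50))]
best_loss, best_params = min(results, key=lambda r: r[0])
```

In a real define-by-run framework, the sampler behind suggest_* is a proper optimizer (e.g., TPE) rather than uniform random draws.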
Oríon is an asynchronous framework for black-box function optimization.
Ray Tune is a scalable HPO framework that can employ several of the aforementioned frameworks as optimizers.