Expert-in-the-loop HPO

HPO in practice can be made more efficient by guiding the search with domain knowledge and intuition that experts often already have. Expert priors can enter the HPO problem in multiple ways, for example by leveraging past evaluations of different hyperparameter settings, or by encoding the expert's explicit belief about good hyperparameters as a prior distribution over the search space. Below we list some of our works and packages that present methods to do so.

Our Packages

  • NePS enables HPO for expensive machine-learning pipelines, with algorithms that can leverage expert prior input.

Our Works

  • Hyperparameter Transfer Across Developer Adjustments shows how previously found good hyperparameter settings can be reused to improve HPO efficiency.
  • πBO presents a simple method to incorporate an expert's prior belief into Bayesian optimization (BO).
  • PriorBand adds an expert-prior interface to Hyperband and other multi-fidelity algorithms, including BO extensions.
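To make the idea of an expert prior concrete, here is a minimal, hedged sketch (plain Python, not the API of any package above; all names and the mixing scheme are illustrative) of encoding an expert's belief about a good hyperparameter value as a distribution and using it to bias random sampling:

```python
import random

def sample_with_prior(low, high, prior_mean, prior_std, use_prior_prob=0.5):
    """Sample a hyperparameter value, biased toward the expert's belief.

    With probability `use_prior_prob`, draw from a Gaussian centered on
    the expert's guess (clipped to the search range); otherwise fall back
    to uniform sampling over the full range.  This mixture keeps some
    exploration even when the prior is wrong.
    """
    if random.random() < use_prior_prob:
        value = random.gauss(prior_mean, prior_std)
        return min(max(value, low), high)  # clip to the search range
    return random.uniform(low, high)

# Illustrative use: the expert believes a learning rate near 1e-3 works well.
samples = [sample_with_prior(1e-5, 1e-1, prior_mean=1e-3, prior_std=5e-4)
           for _ in range(1000)]
```

Methods such as πBO and PriorBand go beyond this naive mixture by integrating the prior with model-based or multi-fidelity search, but the underlying interface is the same: the expert supplies a distribution over the search space rather than a single fixed value.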