SMAC (sequential model-based algorithm configuration) is a versatile tool for optimizing algorithm parameters — or, more generally, the parameters of any process we can run automatically, or any function we can evaluate, such as a simulation.

SMAC has helped us speed up both local search and tree search algorithms by orders of magnitude on certain instance distributions. Recently, we have also found it to be very effective for the hyperparameter optimization of machine learning algorithms, scaling better to high-dimensional and discrete input spaces than other methods. Finally, the predictive models SMAC is built on can also capture and exploit important information about the problem domain, such as which input variables are most important. We hope you find SMAC similarly useful. Ultimately, we hope that it helps algorithm designers focus on tasks that are more scientifically valuable than parameter tuning.
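The core "sequential model-based" loop can be sketched as: evaluate a few configurations, fit a predictive model of cost, and use the model to decide what to run next. The sketch below is a deliberately simplified illustration, not SMAC's implementation: the target function, the inverse-distance surrogate, and all names are hypothetical stand-ins (SMAC itself fits a random-forest model and searches for configurations with high expected improvement, interleaved with random configurations).

```python
import random

def run_target(x):
    # Hypothetical stand-in for running the algorithm being configured
    # with parameter x in [0, 1] and measuring its cost (e.g., runtime).
    return (x - 0.3) ** 2

def surrogate_predict(history, x):
    # Toy cost model: inverse-distance-weighted average of observed costs.
    # (SMAC instead fits a random forest to the observations.)
    num = den = 0.0
    for xi, yi in history:
        w = 1.0 / (abs(x - xi) + 1e-9)
        num += w * yi
        den += w
    return num / den

def smbo(n_init=4, n_iter=20, seed=0):
    rng = random.Random(seed)
    # 1. Initial design: evaluate a few random configurations.
    history = [(x, run_target(x)) for x in (rng.random() for _ in range(n_init))]
    for i in range(n_iter):
        if i % 2 == 0:
            # Interleave purely random configurations for exploration,
            # as SMAC does.
            x_next = rng.random()
        else:
            # Otherwise pick the candidate the model predicts to be best
            # (a crude stand-in for SMAC's expected-improvement search).
            candidates = [rng.random() for _ in range(100)]
            x_next = min(candidates, key=lambda x: surrogate_predict(history, x))
        # 2. Evaluate the real target and feed the result back to the model.
        history.append((x_next, run_target(x_next)))
    # Return the incumbent: the best configuration evaluated so far.
    return min(history, key=lambda p: p[1])

best_x, best_cost = smbo()
print(f"best x = {best_x:.3f}, cost = {best_cost:.5f}")
```

The same loop generalizes to the settings SMAC actually targets: many mixed discrete/continuous parameters, noisy cost measurements, and distributions of problem instances.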


The approach behind SMAC is described in the following publications:

  • Stefan Falkner, Marius Lindauer, and Frank Hutter.
    SpySMAC: Automated Configuration and Performance Analysis of SAT Solvers.
    In: Proceedings of the International Conference on Theory and Applications of Satisfiability Testing (SAT'15), 2015.
  • Frank Hutter, Holger Hoos, and Kevin Leyton-Brown.
    An Evaluation of Sequential Model-Based Optimization for Expensive Blackbox Functions.
    In: Proceedings of the GECCO 2013 Workshop on Blackbox Optimization Benchmarking (BBOB'13), 2013.
  • Frank Hutter, Holger Hoos, and Kevin Leyton-Brown.
    Parallel Algorithm Configuration.
    In: Proceedings of the Conference on Learning and Intelligent Optimization (LION 6), 2012.
  • Frank Hutter, Holger Hoos, and Kevin Leyton-Brown.
    Bayesian Optimization with Censored Response Data.
    In: NIPS 2011 Workshop on Bayesian Optimization, Experimental Design, and Bandits, 2011.
  • Frank Hutter, Holger Hoos, and Kevin Leyton-Brown.
    Sequential Model-Based Optimization for General Algorithm Configuration.
    In: Proceedings of the Conference on Learning and Intelligent Optimization (LION 5), 2011. Second-best paper prize.
  • Frank Hutter, Holger Hoos, Kevin Leyton-Brown, and Kevin Murphy.
    Time-Bounded Sequential Parameter Optimization.
    In: Proceedings of the Conference on Learning and Intelligent Optimization (LION 4), 2010.