AC Hands-on Tutorial (AAAI’16)

Algorithms with free parameters abound in AI and beyond, and the setting of these parameters often makes the difference between failing to solve a problem and achieving state-of-the-art performance. Since manual parameter tuning is tedious and time-consuming, in recent years general methods for solving this algorithm configuration problem have been developed in the AI literature, based on various principles from machine learning, statistics, and optimization. By now there exist many success stories of algorithm configuration, both for hard combinatorial problems (for example, SAT solving, AI planning, mixed integer programming, and timetabling) and for supervised machine learning (for example, Auto-WEKA, auto-sklearn, and parameter tuning in deep learning).

In this hands-on tutorial, we will demonstrate how to use algorithm configuration effectively in practice. Attendees do not require any specialized knowledge and will walk away with hands-on experience in configuring various algorithms that will allow them to apply algorithm configuration in their respective fields of research.

This tutorial will have several hands-on sessions, in which participants will edit and run code on their own laptops. Please make sure your laptop is charged before the tutorial, as the availability of power outlets will be limited.

Schedule 

Saturday, February 13, 2:00 PM – 6:00 PM

  • The Algorithm Configuration Problem [30 minutes]
    • Motivation: examples of free parameters in AI algorithms
    • Motivation: examples of successful applications of algorithm configuration in AI
    • Hands-On: using auto-sklearn for a simple classification problem
    • Formal problem statement of algorithm configuration
    • Relationship to algorithm selection
    • Relationship to blackbox optimization
  • Overview of Methods for Algorithm Configuration [20 minutes]
    • Building blocks: search strategy, intensification strategy, adaptive capping strategy
    • Racing methods: F-Race and Iterative F-Race
    • Local search and evolutionary methods: ParamILS and GGA
    • Model-based methods: SMAC and p-SMAC
  • SpySMAC: a simple Python tool for using algorithm configuration [40 minutes]
    • Configuration of a new algorithm; Hands-on: configuration of a SAT solver
    • Specifying parameters
    • Choosing a good instance set to configure on
    • Choosing configuration budget and runtime cutoff
    • Interpreting results
  • Break [30 minutes]
  • SMAC and pySMAC: two interfaces for using algorithm configuration [25 minutes]
    • pySMAC: Using the Python interface of SMAC
    • Hands-on: Optimization on the Rosenbrock function
    • Using the flexible interface of SMAC
    • Hands-on: Configuring MiniSAT on software verification instances
    • Hands-on: Including instance features (to improve algorithm configuration results)
    • Hands-on: Configuration of a planner
  • Determining the importance of parameters [40 minutes]
    • Functional ANOVA: global influence of parameters on performance
    • Forward selection: determining parameters that suffice to predict performance well
    • Ablation: local influence of parameters on performance on a path from default to optimized configuration
    • Hands-on: analyzing parameter importance returned by SpySMAC (continued from above, but now based on data from longer algorithm runs we provide)
  • Pitfalls and best practices [35 minutes]
    • The overtuning problem; Hands-on: overtuning to a small set of problem instances
    • Common mistakes when writing algorithm wrappers
    • Tips and tricks to get better results
  • Advanced Topics and Conclusion [20 minutes]
    • Combinations with algorithm selection
    • Open research questions
    • Summary and conclusions
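To make the formal problem statement from the first session concrete: algorithm configuration seeks a configuration minimizing expected cost (e.g., runtime) across a set of problem instances. The sketch below is not SMAC or any other tool from the tutorial; it is a minimal random-search configurator in plain Python, with a synthetic, purely hypothetical target algorithm standing in for a real solver.

```python
import random

def run_target(config, instance):
    # Hypothetical target algorithm: returns the cost (e.g., runtime) of one
    # run with `config` on `instance`. In practice this would invoke a real
    # solver; here it is a synthetic cost surface minimized near x=2, y=-1,
    # with a per-instance offset.
    x, y = config["x"], config["y"]
    return (x - 2) ** 2 + (y + 1) ** 2 + 0.1 * instance

def random_configuration(rng):
    # Sample uniformly from the configuration space [-5, 5] x [-5, 5].
    return {"x": rng.uniform(-5, 5), "y": rng.uniform(-5, 5)}

def configure(instances, budget, rng):
    """Random search: return the configuration with the lowest mean cost
    across the training instances, using `budget` sampled configurations."""
    best_config, best_cost = None, float("inf")
    for _ in range(budget):
        config = random_configuration(rng)
        mean_cost = sum(run_target(config, i) for i in instances) / len(instances)
        if mean_cost < best_cost:
            best_config, best_cost = config, mean_cost
    return best_config, best_cost

rng = random.Random(42)
best, cost = configure(instances=range(10), budget=500, rng=rng)
print(best, cost)
```

Real configurators improve on this naive loop with exactly the building blocks listed in the methods overview: a smarter search strategy, an intensification strategy (so that not every configuration is evaluated on every instance), and adaptive capping of long runs.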
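The Rosenbrock function used in the pySMAC hands-on session is a standard continuous optimization benchmark with a narrow curved valley and a global minimum of 0 at (1, 1). This plain-Python sketch (deliberately not using pySMAC itself) defines the function and runs a naive random-search baseline over it:

```python
import random

def rosenbrock(x, y):
    """2-D Rosenbrock function; global minimum 0 at (x, y) = (1, 1)."""
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

# Naive random-search baseline over [-2, 2] x [-2, 2]; a model-based
# configurator such as SMAC would search the same space far more efficiently.
rng = random.Random(0)
best = min(
    ((rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(10000)),
    key=lambda p: rosenbrock(*p),
)
print(best, rosenbrock(*best))
```

Because the valley floor is nearly flat along y = x^2, pure random search finds low but not near-optimal values here, which is precisely why it makes a good warm-up problem for a model-based optimizer.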
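Two of the wrapper mistakes discussed in the pitfalls session are failing to enforce the runtime cutoff and reporting a crashed run as if it had succeeded. The skeleton below is a generic, hypothetical wrapper in Python; the actual command-line and output conventions expected by SMAC's wrapper interface differ, so consult its documentation before adapting this.

```python
import subprocess
import sys
import time

def run_wrapped(cmd, cutoff):
    """Run `cmd` under a hard cutoff (in seconds) and report (status, runtime).
    Enforcing the cutoff and distinguishing TIMEOUT from CRASHED are the two
    points where hand-written wrappers most commonly go wrong."""
    start = time.time()
    try:
        result = subprocess.run(cmd, capture_output=True, timeout=cutoff)
    except subprocess.TimeoutExpired:
        # subprocess.run kills the child before raising, so the cutoff holds.
        return "TIMEOUT", cutoff
    runtime = time.time() - start
    # A non-zero exit code must be reported as a crash, not as a success with
    # some runtime -- otherwise the configurator learns from garbage data.
    status = "SUCCESS" if result.returncode == 0 else "CRASHED"
    return status, runtime

# Demo on a trivial target: a Python one-liner standing in for a real solver.
status, runtime = run_wrapped([sys.executable, "-c", "print('solved')"], cutoff=10)
print(status, round(runtime, 3))
```

A run that overshoots its cutoff takes the TIMEOUT branch instead, so the configurator is charged at most the cutoff rather than an unbounded runtime.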

Resources

Links

  • www.ml4aad.org – Overview of all our tools including the ones used in the tutorial
  • www.aclib.net – Algorithm Configuration Library
  • www.coseal.net – COnfiguration and SElection of ALgorithms (COSEAL) group

Contact

Frank Hutter and Marius Lindauer