Tuesday, June 28 • 4:45pm - 5:03pm
mlrMBO: A Toolbox for Model-Based Optimization of Expensive Black-Box Functions

Many practical optimization tasks, such as finding the best parameters for simulators in engineering or hyperparameter optimization in machine learning, are of a black-box nature, i.e., neither a formula for the objective nor derivative information is available. Instead, we can only query the box for its objective value at a given point. If such a query is very time-consuming, the optimization task becomes extremely challenging, as we have to operate under a severely constrained budget of function evaluations. A modern approach is sequential model-based optimization, also known as Bayesian optimization. Here, a surrogate regression model learns the relationship between decision variables and objective outcome. Sequential point evaluations are planned to simultaneously exploit the functional landscape learnt so far and to ensure exploration of the search space. A popular instance of this general principle is the EGO algorithm, which uses Gaussian processes coupled with the expected improvement criterion for point proposal.

The mlrMBO package offers a rich interface to many variants of model-based optimization. As it builds upon the mlr package for machine learning in R, arbitrary surrogate regression models can be applied. It offers a wide variety of options to tackle different black-box scenarios:

- Optimization of purely continuous as well as mixed continuous-categorical search spaces.
- Single-criterion optimization or approximated Pareto fronts for multi-criteria problems.
- Single point proposal or parallel batch point planning during optimization.

The package is designed as a convenient, easy-to-use toolbox of popular state-of-the-art algorithms, but can also be used as a research framework for algorithm designers.
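To give a flavour of the EGO-style workflow described above, here is a minimal sketch of a single-objective run with mlrMBO. This example is not taken from the talk: the toy objective and all parameter values are illustrative, and the infill-criterion call assumes the current mlrMBO CRAN API (mlrMBO builds on mlr and uses the smoof and ParamHelpers packages to define objectives and search spaces).

```r
library(mlrMBO)  # also loads mlr, smoof and ParamHelpers

# Illustrative "expensive" black-box: a cheap 2-d function stands in
# for a costly simulator or cross-validated learner evaluation.
obj.fun = makeSingleObjectiveFunction(
  name = "toy black-box",
  fn = function(x) sum(x^2) + sin(3 * x[1]),
  par.set = makeNumericParamSet("x", len = 2L, lower = -5, upper = 5)
)

# EGO setup: a Gaussian process (Kriging) surrogate that provides
# uncertainty estimates (requires the DiceKriging package) ...
surrogate = makeLearner("regr.km", predict.type = "se")

# ... combined with expected improvement as the infill criterion,
# under a small budget of sequential evaluations.
ctrl = makeMBOControl()
ctrl = setMBOControlTermination(ctrl, iters = 10L)
ctrl = setMBOControlInfill(ctrl, crit = makeMBOInfillCritEI())

res = mbo(obj.fun, learner = surrogate, control = ctrl)
res$x  # best decision variables found
res$y  # corresponding objective value
```

Because the surrogate is an ordinary mlr learner, swapping the Gaussian process for a different regression model, e.g. a random forest via makeLearner("regr.randomForest"), is a one-line change; this is what allows mixed continuous-categorical search spaces to be handled by the same workflow.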

Moderators
Trevor Hastie

Speakers
Jakob Richter

TU Dortmund University
- Interested in Machine Learning and Hyperparameter Tuning/Optimization.
- Especially through Model-Based Optimization using Kriging or Random Forest.
- Parallel Computation.


Tuesday June 28, 2016 4:45pm - 5:03pm PDT
SIEPR 130