First steps into Bayesian optimization

Ivo Couckuyt, Joachim van der Herten

Introduction

Bayesian optimization is particularly useful for expensive optimization problems: problems where the objective (and any constraints) is time-consuming to evaluate, such as physical measurements, engineering simulations, or hyperparameter optimization of deep learning models. Bayesian optimization can also be beneficial when evaluations are affected by a lot of noise. If your problem does not satisfy these requirements, other optimization algorithms might be better suited.

To set up a Bayesian optimization scheme with GPflowOpt you have to:

  • define your objective and specify the optimization domain
  • set up a GPflow model and choose an acquisition function
  • create a BayesianOptimizer

Objective function

In [1]:
import numpy as np
from gpflowopt.domain import ContinuousParameter


def fx(X):
    # Objective: sum of squares; the minimum lies at the origin
    X = np.atleast_2d(X)
    return np.sum(np.square(X), axis=1)[:, None]

# Optimization domain: x1 in [-2, 2] and x2 in [-1, 2]
domain = ContinuousParameter('x1', -2, 2) + ContinuousParameter('x2', -1, 2)
domain
Out[1]:
Name    Type         Values
x1      Continuous   [-2.  2.]
x2      Continuous   [-1.  2.]
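Note that the objective accepts an (n, d) array of candidate points and returns an (n, 1) column of objective values, which is the form the optimizer below will pass in and expect back. As a quick sanity check (this snippet is not part of the original notebook, and X_test is just an illustrative pair of points):

X_test = np.array([[0.0, 0.0], [1.0, -1.0]])  # two candidate points inside the domain
print(fx(X_test))        # [[ 0.], [ 2.]] -- one objective value per row
print(fx(X_test).shape)  # (2, 1)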

Bayesian optimizer

In [2]:
import gpflow
from gpflowopt.bo import BayesianOptimizer
from gpflowopt.design import LatinHyperCube
from gpflowopt.acquisition import ExpectedImprovement
from gpflowopt.optim import SciPyOptimizer

# Use standard Gaussian process regression on a Latin hypercube design of 21 points
lhd = LatinHyperCube(21, domain)
X = lhd.generate()
Y = fx(X)
model = gpflow.gpr.GPR(X, Y, gpflow.kernels.Matern52(2, ARD=True))
# Keep the lengthscales bounded away from zero for numerical stability
model.kern.lengthscales.transform = gpflow.transforms.Log1pe(1e-3)

# Now create the Bayesian Optimizer
alpha = ExpectedImprovement(model)
optimizer = BayesianOptimizer(domain, alpha)

# Run the Bayesian optimization
with optimizer.silent():
    r = optimizer.optimize(fx, n_iter=15)
print(r)
Warning: optimization restart 1/5 failed
Warning: optimization restart 2/5 failed
Warning: optimization restart 3/5 failed
Warning: optimization restart 2/5 failed
     fun: array([ 0.01])
 message: 'OK'
    nfev: 15
 success: True
       x: array([[ 0. , -0.1]])

That’s all! Your objective function has now been optimized over 15 iterations.
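As the printout suggests, the object returned by optimize exposes SciPy-style result fields (x, fun, nfev, success), so the best point and its objective value can be read off directly. A minimal sketch using the r from the run above:

print(r.x)     # location of the best evaluated point, e.g. array([[ 0. , -0.1]])
print(r.fun)   # corresponding objective value
print(r.nfev)  # number of objective evaluations performed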