# Bayesian Optimizer

`class gpflowopt.BayesianOptimizer(domain, acquisition, optimizer=None, initial=None, scaling=True, hyper_draws=None, callback=<function jitchol_callback>, verbose=False)`

A traditional Bayesian optimization framework implementation.

Like other optimizers, this optimizer is constructed for optimization over a domain. Additionally, it is configured with a separate optimizer for the acquisition function.

Attributes:

- domain – The current domain the optimizer operates on.

Methods

- failsafe() – Context to provide a safe way for optimization.
- get_initial() – Return the initial set of points.
- gradient_enabled() – Returns whether the optimizer is a gradient-based algorithm or not.
- optimize(objectivefx[, n_iter]) – Run Bayesian optimization for a number of iterations.
- set_initial(initial) – Set the initial set of points.
- silent() – Context for performing actions on an optimizer (such as optimize) with all stdout discarded.
`__init__(domain, acquisition, optimizer=None, initial=None, scaling=True, hyper_draws=None, callback=<function jitchol_callback>, verbose=False)`
Parameters:

- domain (Domain) – The optimization space.
- acquisition (Acquisition) – The acquisition function to optimize over the domain.
- optimizer (Optimizer) – (optional) Optimization approach for the acquisition function. If not specified, SciPyOptimizer is used. This optimizer runs on the same domain as the BayesianOptimizer object.
- initial (Design) – (optional) The initial design of candidates to evaluate before the optimization loop runs. If the underlying model already contains data from an initial design, that data is augmented with the evaluations obtained for the points specified by this design.
- scaling (bool) – (default true) If set to true, the outputs are normalized and the inputs are scaled to a unit cube. This only affects model training: calls to acquisition.data, as well as the returned optima, are unscaled (see DataScaler for more details). Note that the models contained by acquisition are modified directly, so references to these models outside of BayesianOptimizer now point to scaled models.
- hyper_draws (int) – (optional) Enable marginalization of model hyperparameters. By default, point estimates are used. If this parameter is set to n, n hyperparameter draws from the likelihood distribution are obtained for each model using Hamiltonian MC (see the GPflow documentation for details). The acquisition score is computed for each draw and averaged.
- callback (callable) – (optional) This function or object is called after the data of all models has been updated, with all models (as retrieved by acquisition.models) as argument, without the wrapping model handling any scaling. This allows custom model optimization strategies to be implemented. All manipulations of GPflow models are permitted. Combined with the optimize_restarts parameter of Acquisition, this allows several scenarios: doing the optimization manually from the callback (optimize_restarts equals 0), or choosing the starting point plus some random restarts (optimize_restarts > 0).
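The callback receives the plain list of GPflow models, so a custom model-training strategy reduces to a function over that list. A minimal sketch (replacing the default jitchol_callback with a plain re-optimization; the function name is illustrative):

```python
# A minimal custom callback, sketched under the assumption that each
# element of 'models' exposes GPflow's optimize() method.
def my_callback(models):
    # 'models' is the list retrieved via acquisition.models, without any
    # scaling wrapper, so all GPflow manipulations are permitted here.
    for model in models:
        model.optimize()
```

It would then be passed as `BayesianOptimizer(domain, acquisition, callback=my_callback)`.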
domain

The current domain the optimizer operates on.

Returns: Domain object
failsafe()

Context to provide a safe way for optimization.

If a RuntimeError is raised, the data of the acquisition object is saved to disk in the current directory. This allows the data to be re-used (which makes sense for expensive data).

The data can be used to experiment with fitting a GPflow model first (analyse the data, set sensible initial hyperparameter values and hyperpriors) before retrying Bayesian optimization.

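The save-on-failure pattern behind failsafe() can be illustrated with a plain context manager (this is a sketch of the idea, not the library's implementation; the function name and file path are assumptions):

```python
import contextlib

import numpy as np


@contextlib.contextmanager
def failsafe_sketch(get_data, path="failsafe_data.npz"):
    """Illustrative stand-in for BayesianOptimizer.failsafe():
    on RuntimeError, dump the acquisition data (X, Y) to disk,
    then re-raise so the failure is still visible."""
    try:
        yield
    except RuntimeError:
        X, Y = get_data()  # e.g. acquisition.data in the real optimizer
        np.savez(path, X=X, Y=Y)
        raise
```

The saved arrays can then be loaded with `np.load(path)` to analyse the data and fit a model offline before retrying.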
optimize(objectivefx, n_iter=20)

Run Bayesian optimization for a number of iterations.

Before the loop is initiated, all points retrieved by get_initial() are first evaluated on the objective and black-box constraints. These points are then added to the acquisition function by calling set_data() (and hence to the underlying models).

In each iteration, a new data point is selected for evaluation by optimizing the acquisition function. The models are then updated with this point.

Parameters:

- objectivefx – (list of) expensive black-box objective and constraint functions. On evaluation, the responses of all the expensive functions are aggregated column-wise. Unlike the typical Optimizer interface, these functions should not return gradients.
- n_iter – Number of iterations to run.

Returns: OptimizeResult object
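The column-wise aggregation of responses can be sketched in plain numpy (the objective and constraint functions here are hypothetical; only the shapes illustrate the contract — each function maps an (n, d) input matrix to an (n, 1) response column, and no gradients are returned):

```python
import numpy as np


# Hypothetical expensive functions: each maps an (n, d) input matrix
# to an (n, 1) response column.
def objective(X):
    return np.sum(X ** 2, axis=1, keepdims=True)


def constraint(X):
    return X[:, [0]] - 0.5


X = np.array([[0.0, 0.0],
              [1.0, 2.0]])

# optimize() aggregates the responses of all expensive functions
# column-wise, giving one column per function:
responses = np.hstack([objective(X), constraint(X)])
# responses has shape (2, 2)
```

With several functions passed as a list, each contributes its own column(s) to the aggregated response matrix used to update the models.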