Acquisition functions¶
The gpflowopt package currently supports a limited number of popular acquisition functions, summarized in the table below; a detailed description of each follows.
Method | Objective | Constraint | # Outputs
---|---|---|---
gpflowopt.acquisition.ExpectedImprovement | ✔ | | 1
gpflowopt.acquisition.ProbabilityOfFeasibility | | ✔ | 1
gpflowopt.acquisition.ProbabilityOfImprovement | ✔ | | 1
gpflowopt.acquisition.LowerConfidenceBound | ✔ | | 1
gpflowopt.acquisition.MinValueEntropySearch | ✔ | | 1
gpflowopt.acquisition.HVProbabilityOfImprovement | ✔ | | > 1
Single-objective¶
Expected Improvement¶
class gpflowopt.acquisition.ExpectedImprovement(model)¶

Expected Improvement acquisition function for single-objective global optimization. Introduced by Mockus et al. (1975).
Key reference:
@article{Jones:1998, title={Efficient global optimization of expensive black-box functions}, author={Jones, Donald R and Schonlau, Matthias and Welch, William J}, journal={Journal of Global optimization}, volume={13}, number={4}, pages={455--492}, year={1998}, publisher={Springer} }
This acquisition function is the expectation of the improvement over the current best observation w.r.t. the predictive distribution. The definition is closely related to ProbabilityOfImprovement, but multiplies the integrand by the improvement over the current best observation.

\[\alpha(\mathbf x_{\star}) = \int \max(f_{\min} - f_{\star}, 0) \, p( f_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} ) \, d f_{\star}\]

Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which data points are considered feasible (according to the acquisition function(s)) and which are not.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(model)¶

Parameters: model – GPflow model (single output) representing our belief of the objective
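For a Gaussian predictive distribution \(N(\mu, \sigma^2)\), this integral has a well-known closed form: \(\alpha = (f_{\min} - \mu)\,\Phi(z) + \sigma\,\phi(z)\) with \(z = (f_{\min} - \mu)/\sigma\). A minimal NumPy/SciPy sketch of that closed form (illustrative only; gpflowopt itself builds this expression as a TensorFlow graph):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, fmin):
    """Closed form of the EI integral for a Gaussian predictive
    distribution N(mu, sigma^2) and incumbent minimum fmin."""
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    z = (fmin - mu) / sigma
    # The CDF term rewards low predicted means, the PDF term rewards
    # high predictive uncertainty.
    return (fmin - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# A candidate predicted exactly at the incumbent, with uncertainty,
# still has positive expected improvement:
print(expected_improvement(mu=0.0, sigma=1.0, fmin=0.0))  # ≈ 0.3989 (sigma * phi(0))
```

The two terms make the exploration/exploitation trade-off of EI explicit: improvement can come from a low mean or from high uncertainty.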
Probability of Feasibility¶
class gpflowopt.acquisition.ProbabilityOfFeasibility(model, threshold=0.0, minimum_pof=0.5)¶

Probability of Feasibility acquisition function for sampling feasible regions. Standard acquisition function for Bayesian optimization with expensive black-box constraints.
Key reference:
@article{Schonlau:1997, title={Computer experiments and global optimization}, author={Schonlau, Matthias}, year={1997}, publisher={University of Waterloo} }
The acquisition function measures the probability of the latent function being smaller than a threshold for a candidate point.

\[\alpha(\mathbf x_{\star}) = \int_{-\infty}^{0} \, p(f_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} ) \, d f_{\star}\]

Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which points are feasible (True) and which are not (False).
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(model, threshold=0.0, minimum_pof=0.5)¶

Parameters:
- model – GPflow model (single output) representing our belief of the constraint
- threshold – observed values lower than the threshold are considered valid
- minimum_pof – minimum PoF score required for a point to be considered valid; for more information, see the docstring of feasible_data_index
constraint_indices()¶

Method returning the indices of the model outputs which correspond to the (expensive) constraint functions. By default there are no constraint functions.
feasible_data_index()¶

Returns a boolean array indicating which points are feasible (True) and which are not (False).

Answering the question "which points are feasible?" is slightly troublesome when noise is present: directly comparing the noisy data to self.threshold does not make much sense. Instead, we rely on the model belief using the PoF (a probability between 0 and 1). As the implementation of the PoF corresponds to the cdf of the (normal) predictive distribution evaluated at the threshold, requiring a minimum PoF of 0.5 implies the mean of the predictive distribution must lie below the threshold for a point to be marked feasible. A minimum PoF of 0 marks all points as valid; setting it to 1 marks all points as invalid.

Returns: boolean ndarray (size N)
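The rule above is easy to state in plain NumPy/SciPy. The sketch below is illustrative only; `probability_of_feasibility` and `feasible_mask` are hypothetical helpers, not gpflowopt API:

```python
import numpy as np
from scipy.stats import norm

def probability_of_feasibility(mu, sigma, threshold=0.0):
    # cdf of the Gaussian predictive distribution evaluated at the threshold
    return norm.cdf((threshold - np.asarray(mu, float)) / np.asarray(sigma, float))

def feasible_mask(mu, sigma, threshold=0.0, minimum_pof=0.5):
    # minimum_pof = 0.5 marks a point feasible iff the predictive mean lies
    # below the threshold; 0 accepts everything, 1 rejects everything.
    return probability_of_feasibility(mu, sigma, threshold) >= minimum_pof

mu = np.array([-1.0, 0.5])     # predictive means at two data points
sigma = np.array([0.5, 0.5])   # predictive standard deviations
print(feasible_mask(mu, sigma))  # [ True False]
```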
Probability of Improvement¶
class gpflowopt.acquisition.ProbabilityOfImprovement(model)¶

Probability of Improvement acquisition function for single-objective global optimization.
Key reference:
@article{Kushner:1964, author = "Kushner, Harold J", journal = "Journal of Basic Engineering", number = "1", pages = "97--106", publisher = "American Society of Mechanical Engineers", title = "{A new method of locating the maximum point of an arbitrary multipeak curve in the presence of noise}", volume = "86", year = "1964" }
\[\alpha(\mathbf x_{\star}) = \int_{-\infty}^{f_{\min}} \, p( f_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} ) \, d f_{\star}\]

Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which data points are considered feasible (according to the acquisition function(s)) and which are not.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(model)¶

Parameters: model – GPflow model (single output) representing our belief of the objective
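For a Gaussian predictive distribution, the integral above is simply the normal cdf evaluated at the incumbent minimum. A one-line NumPy/SciPy sketch (illustrative, not the gpflowopt graph code):

```python
import numpy as np
from scipy.stats import norm

def probability_of_improvement(mu, sigma, fmin):
    # Mass of the Gaussian predictive distribution N(mu, sigma^2)
    # lying below the incumbent minimum fmin.
    return norm.cdf((fmin - np.asarray(mu, float)) / np.asarray(sigma, float))

# A candidate predicted exactly at the incumbent improves with probability 0.5:
print(probability_of_improvement(mu=0.0, sigma=1.0, fmin=0.0))  # 0.5
```

Unlike EI, the score ignores the magnitude of the improvement, only its probability.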
Lower Confidence Bound¶
class gpflowopt.acquisition.LowerConfidenceBound(model, sigma=2.0)¶

Lower confidence bound acquisition function for single-objective global optimization.
Key reference:
@inproceedings{Srinivas:2010, author = "Srinivas, Niranjan and Krause, Andreas and Seeger, Matthias and Kakade, Sham M.", booktitle = "{Proceedings of the 27th International Conference on Machine Learning (ICML-10)}", editor = "F{\"u}rnkranz, Johannes and Joachims, Thorsten", pages = "1015--1022", publisher = "Omnipress", title = "{Gaussian Process Optimization in the Bandit Setting: No Regret and Experimental Design}", year = "2010" }
\[\alpha(\mathbf x_{\star}) = \mathbb{E} \left[ f_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} \right] - \sigma \mbox{Var} \left[ f_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} \right]\]

Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which data points are considered feasible (according to the acquisition function(s)) and which are not.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(model, sigma=2.0)¶

Parameters:
- model – GPflow model (single output) representing our belief of the objective
- sigma – see formula; the higher the value, the more exploration
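The bound itself is a one-liner. Note that, per the formula above, the predictive variance is used rather than the standard deviation. This sketch is illustrative, not the gpflowopt graph code:

```python
import numpy as np

def lower_confidence_bound(mu, var, sigma=2.0):
    # mu: predictive mean, var: predictive variance.
    # A low score is promising either because the predicted mean is low
    # or because the uncertainty (scaled by sigma) is high.
    return np.asarray(mu, float) - sigma * np.asarray(var, float)

print(lower_confidence_bound(mu=1.0, var=0.25))  # 1.0 - 2.0 * 0.25 = 0.5
```

Increasing sigma widens the subtracted uncertainty term and therefore favours unexplored regions.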
Min-Value Entropy Search¶
class gpflowopt.acquisition.MinValueEntropySearch(model, domain, gridsize=10000, num_samples=10)¶

Min-value entropy search acquisition function for single-objective global optimization, the minimization analogue of max-value entropy search. Introduced by Wang et al. (2017).
Key reference:
@InProceedings{Wang:2017, title = {Max-value Entropy Search for Efficient {B}ayesian Optimization}, author = {Zi Wang and Stefanie Jegelka}, booktitle = {Proceedings of the 34th International Conference on Machine Learning}, pages = {3627--3635}, year = {2017}, editor = {Doina Precup and Yee Whye Teh}, volume = {70}, series = {Proceedings of Machine Learning Research}, address = {International Convention Centre, Sydney, Australia}, month = {06--11 Aug}, publisher = {PMLR} }
Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which data points are considered feasible (according to the acquisition function(s)) and which are not.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(model, domain, gridsize=10000, num_samples=10)¶

Parameters:
- model – GPflow model (single output) representing our belief of the objective
- domain – the optimization domain
- gridsize – number of grid points used when sampling the minimum
- num_samples – number of sampled minimum values
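A commonly used closed form of the MES score, written for minimization, averages an information-gain term over sampled values of the unknown global minimum. The sketch below uses the minimization analogue of the max-value formula in Wang & Jegelka (2017); `min_value_entropy_search` is a hypothetical NumPy/SciPy helper and the actual gpflowopt implementation may differ:

```python
import numpy as np
from scipy.stats import norm

def min_value_entropy_search(mu, sigma, min_samples):
    """MES score for a Gaussian predictive N(mu, sigma^2), given K sampled
    values of the unknown global minimum (illustrative helper only)."""
    mu = np.asarray(mu, float)[..., None]
    sigma = np.asarray(sigma, float)[..., None]
    # gamma measures how far the predictive mean sits above each sampled minimum
    gamma = (mu - np.asarray(min_samples, float)) / sigma
    cdf = norm.cdf(gamma)
    # Average the per-sample information gain over the K minimum samples.
    return np.mean(gamma * norm.pdf(gamma) / (2 * cdf) - np.log(cdf), axis=-1)

# Points whose predictive distribution overlaps the sampled minima carry
# more information about the location of the optimum:
score = min_value_entropy_search(mu=0.0, sigma=1.0, min_samples=[-1.0, -1.5])
```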
Multi-objective¶
Hypervolume-based Probability of Improvement¶
class gpflowopt.acquisition.HVProbabilityOfImprovement(models)¶

Hypervolume-based Probability of Improvement.

An acquisition function for multiobjective optimization, used to identify a complete Pareto set of non-dominated solutions.
Key reference:
@article{Couckuyt:2014, title={Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization}, author={Couckuyt, Ivo and Deschrijver, Dirk and Dhaene, Tom}, journal={Journal of Global Optimization}, volume={60}, number={3}, pages={575--594}, year={2014}, publisher={Springer} }
For a Pareto set \(\mathcal{P}\), the non-dominated section of the objective space is denoted by \(A\). The hypervolume() of the dominated part of the space is denoted by \(\mathcal{H}\) and can be used as an indicator of the optimality of the Pareto set (the higher the better).

\[\begin{split}\boldsymbol{\mu} &= \left[ \mathbb{E} \left[ f^{(1)}_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} \right], ..., \mathbb{E} \left[ f^{(p)}_{\star}\,|\, \mathbf x, \mathbf y, \mathbf x_{\star} \right]\right] \\ I\left(\boldsymbol{\mu}, \mathcal{P}\right) &= \begin{cases} \mathcal{H} \left( \mathcal{P} \cup \boldsymbol{\mu} \right) - \mathcal{H} \left( \mathcal{P} \right) & \mbox{if} ~ \boldsymbol{\mu} \in A \\ 0 & \mbox{otherwise} \end{cases} \\ \alpha(\mathbf x_{\star}) &= I\left(\boldsymbol{\mu}, \mathcal{P}\right) \, p\left(\mathbf x_{\star} \in A \right)\end{split}\]

Attributes:

- pareto: An instance of Pareto.
Attributes:

- data: The training data of the models.
- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- models: The GPflow models representing our beliefs of the optimization problem.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- constraint_indices(): Method returning the indices of the model outputs which correspond to the (expensive) constraint functions.
- enable_scaling(domain): Enables and configures the DataScaler objects wrapping the GP models.
- evaluate(Xcand): AutoFlow method to compute the acquisition scores for candidates, without returning the gradients.
- evaluate_with_gradients(Xcand): AutoFlow method to compute the acquisition scores for candidates, also returning the gradients.
- feasible_data_index(): Returns a boolean array indicating which data points are considered feasible (according to the acquisition function(s)) and which are not.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- objective_indices(): Method returning the indices of the model outputs which are objective functions.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_data(X, Y): Update the training data of the contained models.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- build_acquisition, get_parameter_dict, set_parameter_dict, update_feed_dict
__init__(models)¶

Parameters: models – a list of (possibly multioutput) GPflow models representing our belief of the objectives
Pareto module¶
class gpflowopt.pareto.BoundedVolumes(lb, ub)¶

Attributes:

- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- append(lb, ub): Add new bounded volumes.
- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- clear(): Clears all stored bounded volumes.
- empty(dim, dtype): Returns an empty bounded volume (hypercube).
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_state(x): Set the values of all the parameters by recursion.
- size(): Returns the volume of each bounded volume.
- tf_mode(): A context for building models.
- get_parameter_dict, set_parameter_dict, update_feed_dict
append(lb, ub)¶

Add new bounded volumes.

Parameters:
- lb – the lower bounds of the volumes
- ub – the upper bounds of the volumes
clear()¶

Clears all stored bounded volumes.
classmethod empty(dim, dtype)¶

Returns an empty bounded volume (hypercube).

Parameters:
- dim – dimension of the volume
- dtype – dtype of the coordinates

Returns: an empty BoundedVolumes
size()¶

Returns: volume of each bounded volume
class gpflowopt.pareto.Pareto(Y, threshold=0)¶

Attributes:

- data_holders: Return a list of all the child DataHolders.
- fixed: A boolean attribute to determine if all the child parameters of this node are fixed.
- highest_parent: A reference to the top of the tree, usually a Model instance.
- long_name: A unique identifier for a param object within a structure, made by concatenating the names through the tree.
- name: An automatically generated name, given by the reference of the _parent to this instance.
- sorted_params: Return a list of all the child parameters, sorted by id.
Methods:

- bounds_2d(): Computes the cells covering the non-dominated region for the specific case of only two objectives.
- build_prior(): Build a tf expression for the prior by summing all child-parameter priors.
- divide_conquer_nd(): Divide and conquer strategy to compute the cells covering the non-dominated region.
- get_feed_dict_keys(): Recursively generate a dictionary of {object: _tf_array} pairs that can be used in update_feed_dict.
- get_free_state(): Recurse get_free_state on all child parameters, and hstack them.
- get_param_index(param_to_index): Given a parameter, compute the position of that parameter on the free-state vector.
- get_samples_df(samples): Given a numpy array where each row is a valid free-state vector, return a pandas.DataFrame which contains the parameter name and associated samples in the correct form (e.g.
- hypervolume(reference): Autoflow method to calculate the hypervolume indicator.
- make_tf_array(X): Distribute a flat tensorflow array amongst all the child parameters of this instance.
- randomize([distributions, skipfixed]): Calls randomize on all parameters in model hierarchy.
- set_state(x): Set the values of all the parameters by recursion.
- tf_mode(): A context for building models.
- update([Y, generic_strategy]): Update with new output data.
- get_parameter_dict, set_parameter_dict, update_feed_dict
bounds_2d()¶

Computes the cells covering the non-dominated region for the specific case of only two objectives.

Assumes the Pareto set has been sorted in ascending order on the first objective; this implies the second objective is sorted in descending order.
divide_conquer_nd()¶

Divide and conquer strategy to compute the cells covering the non-dominated region.

Generic version: works for an arbitrary number of objectives.
hypervolume(reference)¶

Autoflow method to calculate the hypervolume indicator.

The hypervolume indicator is the volume of the dominated region.

Parameters: reference – reference point to use. It should be equal to or larger than the anti-ideal point of the Pareto set; for comparing results across runs, the same reference point must be used.

Returns: hypervolume indicator (the higher the better)
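For two objectives the dominated volume can be accumulated rectangle by rectangle along the sorted front. An illustrative NumPy sketch (minimization convention; gpflowopt computes this from its cell decomposition, not this loop):

```python
import numpy as np

def hypervolume_2d(front, reference):
    """Hypervolume dominated by a 2-D Pareto front (minimization)
    with respect to a reference point (illustrative helper only)."""
    front = np.asarray(front, float)
    # Sort ascending on the first objective; on a Pareto front the
    # second objective is then descending.
    front = front[np.argsort(front[:, 0])]
    ref_x, ref_y = reference
    hv, prev_y = 0.0, ref_y
    for x, y in front:
        # Each point adds the rectangle between it and the reference,
        # minus the part already covered by its predecessors.
        hv += (ref_x - x) * (prev_y - y)
        prev_y = y
    return hv

front = [[1.0, 3.0], [2.0, 2.0], [3.0, 1.0]]
print(hypervolume_2d(front, reference=(4.0, 4.0)))  # 6.0
```

Using the same reference point across runs, as the docstring requires, keeps these volumes comparable.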
update(Y=None, generic_strategy=False)¶

Update with new output data.

Computes the Pareto set and, if it has changed, recalculates the cell bounds covering the non-dominated region. For the latter, a direct algorithm is used for two objectives; otherwise a generic divide and conquer strategy is employed.

Parameters:
- Y – output data points
- generic_strategy – force the generic divide and conquer strategy regardless of the number of objectives (default False)
gpflowopt.pareto.non_dominated_sort(objectives)¶

Computes the non-dominated set for a set of data points.

Parameters: objectives – data points

Returns: tuple of the non-dominated set and the degree of dominance; dominances gives the number of dominating points for each data point
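The documented return values can be reproduced with a straightforward O(n²) pairwise comparison. This NumPy sketch (minimization convention) is illustrative only; the actual gpflowopt routine may use a different algorithm:

```python
import numpy as np

def non_dominated_sort(objectives):
    """Return the non-dominated subset and, per point, the number of
    points dominating it (illustrative pairwise sketch)."""
    Y = np.asarray(objectives, float)
    # dominates[a, b] is True iff point a is <= point b in every objective
    # and strictly < in at least one, i.e. a dominates b.
    dominates = np.all(Y[:, None, :] <= Y[None, :, :], axis=-1) & \
                np.any(Y[:, None, :] < Y[None, :, :], axis=-1)
    dominances = dominates.sum(axis=0)  # how many points dominate each point
    return Y[dominances == 0], dominances

Y = [[0.0, 2.0], [1.0, 1.0], [2.0, 2.0]]
front, dom = non_dominated_sort(Y)
print(dom)  # [0 0 2]: the third point is dominated by the first two
```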