
Commit 24a279a

Docs updates for kernel and acq func
1 parent de06e69 commit 24a279a

3 files changed: +68 -4 lines changed


docs/user_guide/customizing_gp_acq.md

Lines changed: 56 additions & 1 deletion
@@ -3,7 +3,7 @@
BOA is designed with flexible options for selecting the acquisition function and kernel. BOA can be used by advanced BO users who want to control detailed aspects of the optimization; however, for users who are not BO domain experts, BOA defaults to common, sensible choices. BOA defers to BoTorch's default of a Matern 5/2 kernel, one of the most widely used and flexible choices for BO and GPs (Frazier, 2018; Riche and Picheny, 2021; Jakeman, 2023). It is considered a flexible and broadly applicable kernel (Riche and Picheny, 2021) and is used as the default by many other BO and GP toolkits (Akiba et al., 2019; Balandat et al., 2020; Brea, 2023; Nogueira, 2014; Jakeman, 2023). Similarly, BOA defaults
@@ -38,6 +38,61 @@ generation_strategy:
## Utilizing Ax's Predefined Kernels and Acquisition Functions

Ax has a number of predefined kernel and acquisition function combinations ("models") that can be used in the optimization process. Each of these sits inside a "step" of the generation strategy: your optimization is broken into a sequence of steps, and each step can have its own kernel and acquisition function. For example, the first step is usually a Sobol step that performs a quasi-random initialization of the optimization, and the second step could be a `GPEI` step (GPEI is the Ax model class name and the default for single-objective optimization) that uses the Matern 5/2 kernel and the batched noisy Expected Improvement acquisition function.

```yaml
generation_strategy:
  steps:
    - model: Sobol
      num_trials: 50
      # specify the maximum number of trials to run in parallel
      max_parallelism: 10
    - model: GPEI
      num_trials: -1  # -1 means run the rest of the trials
      max_parallelism: 10
```
Ax's documentation does not currently have a single place that lists all of the available kernel and acquisition function combination models, but they are listed in the [API docs here](https://ax.dev/api/modelbridge.html#ax.modelbridge.registry.Models), and you can see the source code for each model by clicking the source link on that page. Some of the available models are listed below (an illustrative example follows the list):

- `GPEI`: Gaussian Process Expected Improvement, the default for single-objective optimization; uses the Matern 5/2 kernel
- `GPKG`: Gaussian Process Knowledge Gradient; uses the Matern 5/2 kernel
- `SAASBO`: Sparse Axis-Aligned Subspace Bayesian Optimization (see [BO Overview High Dimensionality](bo_overview.md#high-dimensionality) for more details); uses the Matern 5/2 kernel and the batched noisy Expected Improvement acquisition function
- `Sobol`: Sobol quasi-random initialization
- `MOO`: Gaussian Process Expected Hypervolume Improvement for multi-objective optimization; uses the Matern 5/2 kernel
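
As an illustration only, a multi-objective setup could swap the BO step to the `MOO` model listed above. This sketch assumes the same `generation_strategy` schema as the earlier example; the trial counts are placeholders, not recommendations:

```yaml
generation_strategy:
  steps:
    - model: Sobol
      num_trials: 20  # illustrative initialization budget
    - model: MOO      # Expected Hypervolume Improvement with the Matern 5/2 kernel
      num_trials: -1  # run the remaining trials with this step
```
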
64+
65+
If you want to specify your own kernel and acquisition function, you can do so by creating a custom model with the `BOTORCH_MODULAR` model. This model allows you to specify the kernel and acquisition function you want to use. Here is an example of how to use the `BOTORCH_MODULAR` model:

```yaml
generation_strategy:
  steps:
    - model: SOBOL
      num_trials: 5
    - model: BOTORCH_MODULAR
      num_trials: -1  # No limit on how many trials should be produced from this step
      model_kwargs:
        surrogate:
          botorch_model_class: SingleTaskGP  # BoTorch model class name
          covar_module_class: RBFKernel  # GPyTorch kernel class name
          mll_class: LeaveOneOutPseudoLikelihood  # GPyTorch MarginalLogLikelihood class name
        botorch_acqf_class: qUpperConfidenceBound  # BoTorch acquisition function class name
        acquisition_options:
          beta: 0.5
```

In the example above, the `BOTORCH_MODULAR` model is used to specify the `SingleTaskGP` model class, the `RBFKernel` kernel class, and the `qUpperConfidenceBound` acquisition function class. The `qUpperConfidenceBound` acquisition function is a batched version of Upper Confidence Bound, and its `beta` parameter is a hyperparameter that controls the trade-off between exploration and exploitation.

BoTorch model classes can be found in the [BoTorch model API documentation](https://botorch.org/docs/models), and BoTorch acquisition functions can be found in the [BoTorch acquisition API documentation](https://botorch.org/api/acquisition.html).

GPyTorch kernel classes can be found in the [GPyTorch kernel API documentation](https://gpytorch.readthedocs.io/en/latest/kernels.html).

GPyTorch MarginalLogLikelihood classes can be found in the [GPyTorch MarginalLogLikelihood API documentation](https://gpytorch.readthedocs.io/en/latest/marginal_log_likelihoods.html). However, the only MLL classes currently known to work are `ExactMarginalLogLikelihood` and `LeaveOneOutPseudoLikelihood`; other MLL classes may work, but they have not been tested and depend on other implementation details in Ax.
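
As an illustration only (this exact combination is not taken from BOA's tests), a `BOTORCH_MODULAR` step using the other verified MLL class might look like the following; the kernel and acquisition function here are example choices, not recommended defaults:

```yaml
    - model: BOTORCH_MODULAR
      num_trials: -1
      model_kwargs:
        surrogate:
          botorch_model_class: SingleTaskGP        # BoTorch model class name
          covar_module_class: MaternKernel         # GPyTorch kernel class name
          mll_class: ExactMarginalLogLikelihood    # the standard exact GP marginal log likelihood
        botorch_acqf_class: qNoisyExpectedImprovement  # BoTorch acquisition function class name
```
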
```{caution}
The `BOTORCH_MODULAR` model is an area of Ax's code that is still under active development, and many of its components depend on the current implementations of Ax, BoTorch, and GPyTorch, so it is impossible to test every possible combination of kernel and acquisition function. It is therefore recommended to use the predefined models that Ax provides whenever possible.
```

docs/user_guide/package_overview.rst

Lines changed: 11 additions & 2 deletions
@@ -26,8 +26,6 @@ Objective functions

When specifying your objective function to minimize or maximize, :doc:`BOA </index>` comes with a number of metrics you can use with your model output, such as MSE, :math:`R^2`, and others. For the current list of premade metrics available, see :mod:`.metrics.metrics`.

************************************************************************
Creating a model wrapper (Language Agnostic or Python API)
************************************************************************
@@ -39,6 +37,17 @@ and there is a standard interface to follow.

See the :mod:`instructions for creating a model wrapper <.boa.wrappers>` for details.
See the :doc:`examples of model wrappers <.boa.wrappers>`.
See the :doc:`tutorials </examples/index>` for a number of examples of model wrappers in both Python and R.

************************************************************************
Choosing a Custom Kernel and Acquisition Function
************************************************************************

BOA tries to make it easy to use the default kernel and acquisition function, but if you need a different kernel or acquisition function, you can specify one in the configuration file.

See the :doc:`details on how to specify the kernel and acquisition function <customizing_gp_acq>`.

****************************************************
Creating a Python launch script (Usually Not Needed)

tests/scripts/other_langs/r_package_streamlined/config_modular_botorch.yaml

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ generation_strategy:
  botorch_model_class: SingleTaskGP  # BoTorch model class name
  covar_module_class: RBFKernel  # GPyTorch kernel class name
- mll_class: LeaveOneOutPseudoLikelihood
+ mll_class: LeaveOneOutPseudoLikelihood  # GPyTorch MarginalLogLikelihood class name
  botorch_acqf_class: qUpperConfidenceBound  # BoTorch acquisition function class name
  acquisition_options:
    beta: 0.5
