This package provides hyperband tuning for [`mlr3`](https://mlr3.mlr-org.com).
`mlr3hyperband` extends the [mlr3tuning](https://mlr3tuning.mlr-org.com/) package with multifidelity optimization methods based on the successive halving algorithm.
It currently provides optimizers for [bbotk](https://bbotk.mlr-org.com/) and tuners for [mlr3tuning](https://mlr3tuning.mlr-org.com/).

## Resources

* mlr3book chapter on [hyperband](https://mlr3book.mlr-org.com/optimization.html#hyperband) and [hyperparameter tuning](https://mlr3book.mlr-org.com/optimization.html#tuning).
* The original publications introducing the [successive halving](https://arxiv.org/abs/1502.07943) and [hyperband](https://arxiv.org/abs/1603.06560) algorithms.
* Ask questions on [Stackoverflow (tag #mlr3)](https://stackoverflow.com/questions/tagged/mlr3)
## Installation
Install the development version from GitHub:

```r
remotes::install_github("mlr-org/mlr3hyperband")
```
## Hyperband
Hyperband is a budget-oriented procedure that weeds out suboptimally performing configurations early in their training process, increasing the efficiency of the tuning procedure.
For this, several brackets are constructed, each with an associated set of configurations.
These configurations are initialized by stochastic, often uniform, sampling.
Each bracket is divided into multiple stages, and configurations are evaluated with an increasing budget in each stage.
Note that currently all configurations are trained completely from the beginning, so no online updates to the models are performed.

Different brackets are initialized with different numbers of configurations and different budget sizes.
To identify the budget that hyperband operates on, the user has to specify explicitly which hyperparameter of the learner influences the budget by tagging a single hyperparameter in the parameter set with `"budget"`.
An alternative approach using subsampling and pipelines is described further below.
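The `"budget"` tag can be sketched as follows, assuming the `paradox` package for search space construction; the parameter names and ranges below are illustrative, not prescribed by the package:

```r
library(paradox)

# Illustrative search space: `nrounds` carries the "budget" tag, so a
# hyperband tuner would evaluate configurations at increasing values of it,
# while `eta` is an ordinary tuned hyperparameter.
search_space = ps(
  nrounds = p_int(lower = 16, upper = 128, tags = "budget"),
  eta     = p_dbl(lower = 1e-4, upper = 1, logscale = TRUE)
)
```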
## Examples
### Basic
```{r, include = FALSE}
# mute load messages
library(bbotk)
library(mlr3hyperband)
library(mlr3learners)
```
If you are already familiar with `mlr3tuning`, then the only change compared to other tuners is tagging a numeric hyperparameter with `"budget"`.
Afterwards, you can handle hyperband like any other tuner.
Originally, hyperband was created with a "natural" learning parameter as the budget parameter in mind, such as `nrounds` of the XGBoost learner.
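A minimal sketch of such a run, assuming the `mlr3verse` meta-package, the built-in `pima` task, and XGBoost installed; the ranges, the `eta = 3` halving rate, and the resampling choice are illustrative:

```r
library(mlr3verse)

# Tag `nrounds` as the budget parameter: hyperband raises it stage by stage
# while eliminating the worst-performing configurations.
learner = lrn("classif.xgboost",
  nrounds   = to_tune(p_int(27, 243, tags = "budget")),
  eta       = to_tune(1e-4, 1, logscale = TRUE),
  max_depth = to_tune(1, 20)
)

# Run hyperband (halving rate eta = 3) on the pima task with 3-fold CV.
instance = tune(
  tuner = tnr("hyperband", eta = 3),
  task = tsk("pima"),
  learner = learner,
  resampling = rsmp("cv", folds = 3),
  measures = msr("classif.ce")
)

# Best configuration found and its cross-validated classification error.
instance$result
```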
0 commit comments