- `gp`: Breaking change - Change GP training Rust API to use one-dimensional array by @relf in #222
- `doe`: Fix optimized LHS ESE algorithm by @relf in #219
- `egobox::Gpx`: Exception raised if training output data is not one-dimensional by @relf in #218
- `moe`: Refactor `predict_smooth` by @relf in #221
- Update to numpy crate 0.22.1 to fix win32 compilation by @relf in #216
- `moe`: Save/Load surrogates in binary format by @relf in #213
- Fix badge link by @relf in #201
- Upgrade to PyO3 0.22 by @relf in #203
- Add EGObox logo by @relf in #193
- `gp`: Add training infos getters at Python level by @relf in #196
- `ego`:
  - Maintenance by @relf in #183
- `gp`: Fix variance gradient computation by @relf in #177
- `ego`:
  - Implement TREGO algorithm by @relf in #173
  - Fix added point count in TREGO local step by @relf in #174
  - Fix WB2S criteria scaling factor and fmin computation by @relf in #175
  - Prepare release 0.21 by @relf in #176
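For readers unfamiliar with the criteria named above: WB2 regularizes the Expected Improvement (EI) with the predicted mean, and WB2S additionally scales the EI term. A minimal NumPy/SciPy sketch of the formulas (illustrative only; the function names and the exact scaling-factor computation in egobox are assumptions here, not its implementation):

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """EI at points with GP mean `mu`, std `sigma`, and current best `f_min`."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predicted variance
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def wb2(mu, sigma, f_min):
    """WB2 merit (to be maximized): EI regularized by the predicted mean."""
    return -mu + expected_improvement(mu, sigma, f_min)

def wb2s(mu, sigma, f_min, scale):
    """WB2S: WB2 with a scaling factor applied to the EI term."""
    return -mu + scale * expected_improvement(mu, sigma, f_min)
```

With `scale = 1`, WB2S reduces to WB2; larger scales weight exploration more heavily.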
- `ego`:
  - Add dependabot cargo ecosystem check by @relf in #163
- `gp`:
  - Allow fixed hyperparameters theta for GP and Sparse GP by @relf in #155
- `egobox`:
  - Fix GP mixture with kpls option on Griewank test function by @relf in #150
- [Breaking changes] `gp`, `moe`, `egobox` (Python): Rename `predict_derivatives()` as `predict_gradients()` by @relf in #148
- [Breaking changes] `gp` API renaming by @relf in #145
  - `predict_values()` is renamed `predict()`
  - `predict_variances()` is renamed `predict_var()`
  - `predict_variance_derivatives()` is renamed `predict_var_derivatives()`
  - Derivatives predictions (`predict_derivatives()` and `predict_var_derivatives()`) are made available in Python.
- Refactor Mixture of Experts by @relf in #146: factorize code between full GP and sparse GP implementations
- Add `Gpx` accessors by @relf in #140
- Fix `LHS` maximin bug by @relf in #141
- `doe`: Improve classic, centered and maximin LHS performances by @relf in #138
- `doe`: Improve optimized LHS performances (1.25x speedup) by @relf in #136
- Rework (mostly internal) API to avoid awkward `&Option` by @relf in #134
- Add Python bindings for all LHS flavours by @relf in #135
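As background on the LHS flavours mentioned above: classic LHS places one random point inside each stratum per dimension, while centered LHS uses the stratum midpoint. A self-contained NumPy sketch of the idea (not the egobox `doe` implementation):

```python
import numpy as np

def lhs(n_samples, n_dim, centered=False, rng=None):
    """Latin hypercube in [0, 1]^n_dim: each of the n_samples strata of
    every dimension contains exactly one point."""
    rng = np.random.default_rng(rng)
    # One random permutation of stratum indices per dimension.
    perms = np.array([rng.permutation(n_samples) for _ in range(n_dim)]).T
    if centered:
        offset = 0.5                             # stratum midpoint
    else:
        offset = rng.random((n_samples, n_dim))  # random position in stratum
    return (perms + offset) / n_samples
```

Maximin-optimized variants start from such a hypercube and then rearrange it to maximize the minimum pairwise distance.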
- `gp`: Implement sparse gaussian process methods (cf. `SparseGaussianProcess`)
  - Python binding: `SparseGpMix`, see doc/tutorial
- GP/SGP API:
  - Hyperparameter tuning: initial theta guess and bounds can be specified (`theta_init`, `theta_bounds`); `n_start` controls the number of optimization multistarts
- In GP/SGP, `rayon` is used to run the optimization multistarts in parallel
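The `theta_init`/`theta_bounds`/`n_start` knobs describe a standard multistart scheme: optimize the likelihood objective from the given initial theta plus random restarts sampled inside the bounds, keeping the best local optimum. A generic sketch of that scheme (the function name and toy objective are illustrative, not the GP/SGP internals):

```python
import numpy as np
from scipy.optimize import minimize

def tune_theta(objective, theta_init, theta_bounds, n_start=10, rng=None):
    """Minimize `objective(theta)` (e.g. a negative log-likelihood) starting
    from `theta_init` plus n_start - 1 random starts inside `theta_bounds`,
    and keep the best local optimum found."""
    rng = np.random.default_rng(rng)
    lo, hi = np.array(theta_bounds, float).T
    starts = [np.asarray(theta_init, float)]
    starts += [rng.uniform(lo, hi) for _ in range(n_start - 1)]
    best = None
    for x0 in starts:
        res = minimize(objective, x0, bounds=theta_bounds, method="L-BFGS-B")
        if best is None or res.fun < best.fun:
            best = res
    return best.x, best.fun
```

Each restart is independent, which is why the multistarts parallelize naturally (with `rayon` on the Rust side).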
- `ego`: Fix the ask-and-tell interface `suggest()` method in presence of discrete variables to work in the discrete space, not the continuous one. A few API breaking changes:
  - `EgorConfig::xtypes` is not an option anymore
  - `EgorSolver::new_with_xtypes()` is renamed `new`, as `new` with `xlimits` is removed; use `to_xtypes` to convert `xlimits`
  - `EgorConfig::no_discrete` attribute removed; use the `EgorConfig::discrete()` method
  - `SurrogateBuilder::new_with_xtypes_rng` renamed `new_with_xtypes`
- `ego`: API refactoring to enable the ask-and-tell interface
  - Configuration of Egor is factorized out in `EgorConfig`
  - `EgorBuilder` gets a `configure` method to tune the configuration
  - `EgorService` structure represents `Egor` when used as a service
  - Python `Egor` API changes:
    - The function under optimization is now given via the `minimize(fun, max_iters=...)` method
    - New method `suggest(xdoe, ydoe)` allows to ask for an x suggestion and tell the current function evaluations
    - New method `get_result(xdoe, ydoe)` to get the best evaluation (i.e. the minimum) from the given ones
- `gp` uses pure Rust COBYLA by @relf in #110
- `ego` as pure Rust implementation (`nlopt` is now optional) by @relf in #112
- `egobox` Python module: Simplify mixed-integer type declaration by @relf in #115
- Upgrade dependencies by @relf in #114
- Upgrade edition 2021 by @relf in #109
- CI maintenance by @relf in #111
- Bump actions/checkout from 2 to 4 by @dependabot in #107
- Bump actions/setup-python from 2 to 4 by @dependabot in #108
- Automate Python package build and upload on PyPI from GitHub CI by @relf in #104
- Fix FullFactorial when the asked number of samples is small wrt the x dimension by @relf in #105
- Make mixed-integer sampling methods available in Python by @relf in #106
- `gp`, `moe` and `egobox` Python module:
  - Added Gaussian process sampling (#97)
  - Added string representation (#98)
- `egobox` Python module:
  - Change recombination enum to respect Python uppercase convention (#98)
- Notebooks and documentation updates (#97, #98, #99)
- `ego`:
  - Infill criterion is now a trait object in the `EgorSolver` structure (#92)
  - `Egor` and `EgorSolver` API: methods taking an argument of type `Option<T>` now take an argument of type `T` (#94)
  - `EgorBuilder::min_within_mixed_space()` is now `EgorBuilder::min_within_mixint_space()` (#96)
  - `egobox-ego` library doc updated (#95)
- `egobox` Python module: Upgrade to PyO3 0.18 (#91)
- `ego`:
  - Fix Egor solver best iter computation (#89)
- `ego`:
  - Make objective and constraints training run in parallel (#86)
  - Lock mopta execution to allow concurrent computations (#84)
  - Fix and adjust infill criterion optimization retries strategy (#87)
- `moe`:
  - Fix k-fold cross-validation (#85)
- `ego`:
  - Rename `XType`, `XSpec` for consistency (#82)
  - Export history in optimization result (#81)
  - Use number of iterations instead of number of evaluations; rename `q_parallel` as `q_points` (#79)
  - Warn when inf or nan is detected during obj scaling computation (#78)
  - Parallelize constraint scales computations (#73)
  - Parallelize multistart optimizations (#76)
  - Handle GMM errors during MOE training (#75)
  - Handle possible errors from GMM clustering (#74)
  - Upgrade argmin 0.8.0 (#72)
  - Add mopta08 test case as example (#71)
  - Fix scaling check for infinity (#70)
  - Use kriging surrogate by default (#69)
- `gp`:
  - Add analytic derivatives computations (#54, #55, #56, #58, #60). All derivatives for all mean/correlation models are implemented.
  - Refactor `MeanModel` and `CorrelationModel` methods:
    - `apply()` renamed to `value()`
    - `jac()` renamed to `jacobian()`
  - Fix prediction computation when using linear regression (#52)
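Analytic kernel derivatives like those added in #54-#60 are easy to validate against finite differences; e.g. for a squared-exponential correlation (a stand-alone illustrative check, not the egobox code):

```python
import numpy as np

def sq_exp(x, y, theta):
    """Squared-exponential correlation k(x, y) = exp(-theta * ||x - y||^2)."""
    return np.exp(-theta * np.sum((x - y) ** 2))

def sq_exp_grad_x(x, y, theta):
    """Analytic gradient of k wrt x: dk/dx = -2 * theta * (x - y) * k."""
    return -2.0 * theta * (x - y) * sq_exp(x, y, theta)

# Central finite-difference check of the analytic gradient.
x, y, theta = np.array([0.3, 0.7]), np.array([0.1, 0.4]), 1.5
eps = 1e-6
fd = np.array([
    (sq_exp(x + eps * e, y, theta) - sq_exp(x - eps * e, y, theta)) / (2 * eps)
    for e in np.eye(2)
])
```

The same pattern extends to derivatives of the mean models and of the predicted variance.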
- `ego`:
  - Refactor `Egor` using the `argmin 0.7.0` solver framework
    - `EgorSolver` can be used with `argmin::Executor` and benefit from observers and checkpointing features (#67)
    - `Egor` uses the kriging setting by default (i.e. one cluster with constant mean and squared exponential correlation model)
- Add notebook on the Mauna Loa CO2 example to show `GpMix`/`Gpx` surrogate model usage (#62)
- Use xoshiro instead of the isaac random generator (#63)
- Upgrade `ndarray 0.15`, `linfa 0.6.1`, `PyO3 0.17` (#57, #64)
- `gp`: Kriging derivatives predictions are implemented (#44, #45); derivatives for GP with linear regression are implemented (#47)
  - `predict_derivatives`: prediction of the derivatives of the output y wrt the input x
  - `predict_variance_derivatives`: prediction of the derivatives of the output variance wrt the input x
- `moe`: as for `gp`, derivatives methods for smooth and hard predictions are implemented (#46)
- `ego`: when available, derivatives are used to optimize the infill criterion with SLSQP (#44)
- `egobox` Python binding: add `GpMix`/`Gpx` in the Python `egobox` module, the Python binding of `egobox-moe::Moe` (#31)
- Add Egor `minimize` interruption capability (Ctrl+C) from Python (#30)
- Minor performance improvement in moe clustering (#29)
- Improvements following JOSS submission review (#34, #36, #38, #39, #40, #42)
- Generate Python `egobox` module for Linux (#20)
- Improve `Egor` robustness by adding LHS optimization (#21)
- Improve `moe` with automatic determination of the number of clusters (#22)
- Use `linfa 0.6.0`, making the BLAS dependency optional (#23)
- Improve `Egor` by implementing automatic reclustering every 10-point addition (#25)
- Fix `Egor` parallel infill strategy (qEI): bad objectives and constraints GP models update (#26)
Improve mixture of experts (#15)
- Implement moe save/load (feature `persistent`)
- Rename `GpSurrogate` to `Surrogate`
- Remove `fit_for_predict`
- Implement `ParamGuard` for `MoeParams`
- Implement `Fit` for `MoeParams`
- Rename `MoeParams` setters
Refactor `moe`/`ego` relation (#16)
- Move `MoeFit` as `SurrogateBuilder` from `moe` to `ego`
- Implement `SurrogateBuilder` for `Moe`
- `Moe` uses the `linfa::Fit` trait
- Rename `Evaluator` as `PreProcessor`
Refactor `MixintEgor` (#17)
- Rename `PreProcessor::eval` to `run`
- Implement `linfa::Fit` for `MixintMoeParams`; use `linfa::Dataset`
- Rename `SurrogateParams` to `MoeBuilder`
- Rename `n_parallel` to `q_parallel` (qEI strategy)
- Improve documentation
- `egobox` Python module: rename the egobox `Optimizer` class to `Egor`
- Add hot start
- Add constraint handling
- Add mixed-integer optimization capability
- Add Python binding with PyO3
Initial release
- `doe`: `LHS`, `FullFactorial`, `Random` sampling
- `gp`: Gaussian process models with 3 regression models (constant, linear, quadratic) and 4 correlation models (squared exponential, absolute exponential, matern32, matern52)
- `moe`: Mixture of Experts: find the best mix of GPs, given a number of clusters, regarding smooth or hard recombination
- `ego`: contains the Egor optimizer, a super-EGO algorithm implemented on top of the previous elements. It implements several infill strategies (EI, WB2, WB2S) and uses either COBYLA or SLSQP for internal optimization.
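As a pocket illustration of the EGO loop described above (fit a GP surrogate, maximize an infill criterion, evaluate the objective, repeat), here is a deliberately tiny 1-D version using plain EI on a fixed candidate grid with the classic Forrester test function; it is a didactic sketch under simplifying assumptions (zero-mean GP, fixed theta, grid search instead of COBYLA/SLSQP), not the egobox algorithm:

```python
import numpy as np
from scipy.stats import norm

def gp_fit_predict(xt, yt, xq, theta=10.0, noise=1e-8):
    """Zero-mean GP regression: posterior mean/std at query points `xq`
    given 1-D training data (xt, yt), squared-exponential correlation."""
    k = lambda a, b: np.exp(-theta * (a[:, None] - b[None, :]) ** 2)
    ktt = k(xt, xt) + noise * np.eye(len(xt))  # nugget for stability
    kqt = k(xq, xt)
    mu = kqt @ np.linalg.solve(ktt, yt)
    var = 1.0 - np.einsum("ij,ji->i", kqt, np.linalg.solve(ktt, kqt.T))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def ego(f, x0, xq, n_iter=15):
    """Minimal EGO: repeatedly add the candidate point that maximizes EI."""
    xt = np.array(x0, float)
    yt = np.array([f(x) for x in xt])
    for _ in range(n_iter):
        mu, sd = gp_fit_predict(xt, yt, xq)
        z = (yt.min() - mu) / sd
        ei = (yt.min() - mu) * norm.cdf(z) + sd * norm.pdf(z)
        xn = xq[np.argmax(ei)]  # infill point
        xt, yt = np.append(xt, xn), np.append(yt, f(xn))
    return xt[np.argmin(yt)], yt.min()

f = lambda x: (6 * x - 2) ** 2 * np.sin(12 * x - 4)  # Forrester test function
x_best, y_best = ego(f, x0=[0.0, 0.5, 1.0], xq=np.linspace(0.0, 1.0, 201))
```

Egor adds to this skeleton the mixture-of-experts surrogate, constraint handling, mixed-integer support and the qEI parallel infill strategy.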