
Releases: GAMES-UChile/mogptk

v0.5.1

v0.5.0

10 Dec 02:49
  • Many, many bug fixes
  • Revise all examples in the documentation
  • Revert to default float64 dtype instead of float32 to avoid precision errors
  • Improve verbose training output
  • Improve plots slightly
  • Add mogptk.gpr.MultiOutputMean, supporting a different mean function for each output dimension (see the sketch after this list)
  • Make all randomness come from PyTorch rather than partly from NumPy
  • Revert memory optimization for exact model to avoid Cholesky problems
  • Add prediction confidence intervals and likelihood sampling
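
As a hedged illustration of the new per-output mean support, the sketch below builds a mogptk.gpr.MultiOutputMean from one mean function per channel. The list-based constructor and the ConstantMean/LinearMean class names are assumptions based on the mean functions introduced in v0.3.0, not verified against the released API.

```python
import mogptk

# Hypothetical sketch: one mean function per output dimension/channel.
# Assumes MultiOutputMean takes a list with one mean per channel and that
# ConstantMean/LinearMean live in mogptk.gpr (both are assumptions).
mean = mogptk.gpr.MultiOutputMean([
    mogptk.gpr.ConstantMean(),  # mean for the first output channel
    mogptk.gpr.LinearMean(),    # mean for the second output channel
])
```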

v0.4.0

v0.3.5

06 Dec 19:45

v0.3.4

v0.3.2

v0.3.1

17 Jul 20:02
  • Fix conversions to/from GPU
  • Fix error on plot_losses()
  • Rename gpr.PhiKernel to gpr.FunctionKernel
  • Add kernel shortcuts such as mogptk.Kernels.SpectralMixture
  • Include end point when calling Data.remove_range()
  • Fix input dimensions for AddKernel and MulKernel
  • Add sigma and figsize arguments to Model.plot_prediction() (see the sketch after this list)
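
A minimal sketch of the new plot_prediction() arguments, assuming a small single-channel dataset and the mogptk.SM model; only the sigma and figsize keywords are confirmed by this release note, the remaining constructor and training arguments are assumptions.

```python
import numpy as np
import mogptk

# Toy single-channel dataset (Data constructor details are assumptions).
x = np.linspace(0.0, 10.0, 200)
y = np.sin(2.0 * x) + 0.1 * np.random.randn(200)
data = mogptk.Data(x, y, name='channel0')

# Spectral mixture model with Q components (constructor details assumed).
model = mogptk.SM(mogptk.DataSet(data), Q=2)
model.train(iters=200)

# New in this release: sigma sets the width of the plotted confidence band
# and figsize is passed through to matplotlib.
model.plot_prediction(sigma=2, figsize=(12, 4))
```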

v0.3.0

01 Jun 14:29

Features

  • Support for variational and sparse models
  • Support for multi-output (heterogeneous) likelihoods, i.e. a different likelihood for each channel
  • New models: Snelson, OpperArchambeau, Titsias, Hensman
  • New kernels: Constant, White, Exponential, LocallyPeriodic, Cosine, Sinc
  • New likelihoods: StudentT, Exponential, Laplace, Bernoulli, Beta, Gamma, Poisson, Weibull, LogLogistic, LogGaussian, ChiSquared
  • New mean functions: Constant and Linear
  • Allow kernels to be added and multiplied (i.e. K1 + K2 or K1 * K2), as shown in the sketch after this list
  • Data and DataSet now accept more data types as input, such as pandas Series
  • Data, DataSet, and Model plot functionalities return the figure and axes to allow customization
  • Support sampling (prior or posterior) from the model
  • Add the MOHSM kernel: multi-output harmonic spectral mixture kernel (Altamirano 2021)
  • Parameters can be pegged to other parameters, essentially removing them from training
  • The exact model supports training with known data point variances and draws their error bars in plots
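
To illustrate the kernel composition feature, here is a hedged sketch that adds and multiplies kernels from the gpr namespace. The kernel class names and the input_dims argument are assumptions; AddKernel and MulKernel are mentioned elsewhere in these notes.

```python
import mogptk.gpr as gpr

# Compose kernels with + and * (new in this release). Class names and the
# input_dims argument are assumptions about the gpr namespace.
k_smooth = gpr.SquaredExponential(input_dims=1)
k_periodic = gpr.Periodic(input_dims=1)
k_noise = gpr.White(input_dims=1)

# Roughly equivalent to gpr.AddKernel(gpr.MulKernel(k_smooth, k_periodic), k_noise)
kernel = k_smooth * k_periodic + k_noise
```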

Improvements

  • Jitter added to the diagonal before calculating the Cholesky is now relative to the average value of the diagonal; this improves numerical stability for all kernels irrespective of the actual numerical magnitude of the values
  • Kernels now implement K_diag that returns the kernel diagonal for better performance
  • BNSE initialization method has been reimplemented with improved performance and stability
  • Parameter initialization from the various initialization methods has been much improved for all models
  • Inducing point initialization now supports random, grid, or density placement
  • New SpectralMixture (in addition to Spectral) and MultiOutputSpectralMixture (in addition to MultiOutputSpectral) kernels with higher performance
  • Allow mixing of single-output and multi-output kernels using active
  • All plotting functions have been restyled
  • Model training allows a custom error function to be evaluated at each iteration
  • Support single and cross lengthscales for the SquaredExponential, RationalQuadratic, Periodic, LocallyPeriodic kernels
  • Add AIC and BIC methods to the model (see the sketch after this list)
  • Add model.plot_correlation()
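
A hedged sketch of the new model-selection helpers: the AIC()/BIC() method names are inferred from the note above, and model is assumed to be an already trained mogptk model; only plot_correlation() is confirmed by name in these notes.

```python
# `model` is assumed to be a trained mogptk model (e.g. MOSM or SM).
# AIC()/BIC() method names are assumptions based on the release note.
print('AIC: {:.1f}'.format(model.AIC()))
print('BIC: {:.1f}'.format(model.BIC()))

# New correlation plot between channels (name confirmed by the release note).
model.plot_correlation()
```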

Changes

  • Remove rescale_x
  • Rename Parameter.trainable to Parameter.train
  • Kernels are initialized deterministically by default rather than randomly; however, the models (MOSM, MOHSM, CONV, CSM, SM-LMC, and SM) are still initialized randomly by default
  • Plotting predictions now happens from the model instead of the data: call model.plot_prediction() instead of model.predict() followed by data.plot() (see the sketch after this list)
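
The sketch below contrasts the old and new plotting flow and the renamed parameter flag; model, data, and param are assumed objects (a trained model, its data, and a mogptk.gpr.Parameter).

```python
# Before v0.3.0: predictions were computed and then plotted from the data.
#   model.predict()
#   data.plot()

# From v0.3.0 on: plot predictions directly from the model.
model.plot_prediction()

# Renamed flag: Parameter.trainable is now Parameter.train, e.g. to freeze a
# parameter (param is assumed to be a mogptk.gpr.Parameter instance):
param.train = False
```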

v0.2.5

08 Sep 00:00

Bug fixes

v0.2.4

26 Jul 13:59
  • Set maximum frequency to Nyquist in MOSM, CSM, SM-LMC, and SM; fixes #21
  • Improve CholeskyException messaging
  • Update the GONU example
  • Fix Sigmoid.backward, fixes #25
  • Add support for multiple input dimensions for remove_range, fixes #24 (see the sketch after this list)
  • Fix SM model initialization for IPS
  • Data now permits different dtypes per input dimension for X; LoadFunction now works for multiple input dimensions; upgrading the time delta for datetime64 is now fixed
  • Change X from (n,input_dims) to [(n,)] * input_dims
  • Add a dim argument to functions to specify the input dimension
  • Fix example 06
  • Fix old import path, fixes #27
  • Reuse torch.eye in log_marginal_likelihood
  • Make rescale_x optional for models, see #28; return losses and errors from train()
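
As a hedged example of the multi-dimensional remove_range support, the sketch below removes a range along the second input dimension; the Data constructor for 2-D inputs and the positional start/end arguments are assumptions, only the dim argument is taken from these notes.

```python
import numpy as np
import mogptk

# Toy data with two input dimensions (constructor details are assumptions).
x = np.random.rand(100, 2) * 10.0
y = np.sin(x[:, 0]) + np.cos(x[:, 1])
data = mogptk.Data(x, y)

# remove_range() now handles multiple input dimensions; dim selects which
# input dimension the range applies to (start/end argument names assumed).
data.remove_range(2.0, 4.0, dim=1)
```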