We are excited to announce the release of XGBoostLSS v0.4.0! This release brings a new feature, package updates, stability improvements, and bug fixes. Here are the key highlights of this release:
New Features
Mixture Distributions
XGBoostLSS now supports mixture distributions for modeling univariate targets! Mixture densities, or mixture distributions, extend traditional univariate distributions by interpreting the observed data as a combination of multiple underlying processes. Thanks to their high flexibility, mixture densities can portray a diverse range of shapes, making them adaptable to a plethora of datasets. By introducing mixture densities in XGBoostLSS, users gain a better understanding of the conditional distribution of the response variable and achieve a more precise representation of the data-generating process.
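As a concrete illustration of the idea (independent of the XGBoostLSS API), the sketch below evaluates a two-component Gaussian mixture density with SciPy. The function name, weights, and component parameters are purely illustrative, not part of the library:

```python
import numpy as np
from scipy.stats import norm

def mixture_pdf(x, weights, means, scales):
    """Density of a Gaussian mixture: sum_k w_k * N(x; mu_k, sigma_k)."""
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0), "mixture weights must sum to 1"
    # Evaluate each component density, then take the weighted sum.
    components = np.stack([norm.pdf(x, loc=m, scale=s) for m, s in zip(means, scales)])
    return weights @ components

# A bimodal shape that no single Gaussian could represent:
x = np.linspace(-6.0, 6.0, 1001)
density = mixture_pdf(x, weights=[0.4, 0.6], means=[-2.0, 2.0], scales=[1.0, 0.8])

# Sanity check: the mixture density still integrates to (approximately) one.
density_integral = np.sum(density) * (x[1] - x[0])
```

Because each component contributes its own mode and spread, mixtures like this can capture multimodality, skewness, and heavy tails that a single parametric distribution would miss.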
Stability Improvements
Model Estimation
We have improved the stability of the model estimation process. This yields more consistent and accurate parameter estimates, leading to better predictions and increased model reliability.
Bug Fixes
In addition to the new features and stability improvements, we have addressed various bugs reported by the community. These bug fixes enhance the overall reliability and usability of XGBoostLSS.
Package Dependency Updates
We have updated some of the package dependencies to their latest versions.
General
We appreciate the valuable feedback and contributions from our users, which have helped us make XGBoostLSS even better. We encourage you to update to this latest version to take advantage of the new features and improvements. To get started, check out the documentation and examples.
Thank you for your continued support, and we look forward to your feedback.
Happy modeling!