Releases: PedroSeber/SmartProcessAnalytics
v1.5.0: classification
Added classification models to all non-deep-learning architectures (except for PLS/SPLS). These are accessible with the keyword classification = True in SPA.main_SPA(). The labels for model_name do not change based on whether the user is doing regression or classification.
Also fixed some minor bugs and added citations related to this repo.
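A minimal, hypothetical call sketch of the new option is below; the dataset path, the 'RF' label, and the argument layout are assumptions, and only SPA.main_SPA(), model_name, and classification = True come from this note.

```python
import SPA  # SmartProcessAnalytics entry point

# Sketch only: the dataset path and the 'RF' model label are placeholders;
# model_name and classification = True are the options described in v1.5.0.
SPA.main_SPA(
    'training_data.csv',    # placeholder dataset path
    model_name = ['RF'],    # same labels as in regression runs
    classification = True,  # v1.5.0: fit classifiers instead of regressors
)
```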
v1.4.2: more bugfixes and merged SPLS and PLS
Fixed some bugs, including one that appeared when using LCEN with lag > 0 and LCEN_transform_y == True. Merged SPLS and PLS, since PLS is just a special case of SPLS with eta = 0; also parallelized SPLS. Made the input variable scaling more robust and ensured it obeys what the user requests via the function inputs.
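As a hedged illustration of the merge (whether eta can be passed to main_SPA() like this is an assumption; only the eta = 0 equivalence is stated above):

```python
import SPA

# Sketch only: after the merge, ordinary PLS is requested as SPLS with eta = 0.
# The eta keyword spelling and the dataset path are assumptions.
SPA.main_SPA(
    'process_data.csv',     # placeholder dataset path
    model_name = ['SPLS'],
    eta = [0],              # eta = 0 reduces SPLS to plain PLS
)
```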
v1.4.1: bugfix and general hyperparameter reporting
Fixed a bug that led to incorrect test y_hats being reported when using LCEN with lag > 0. Also added general hyperparameters (such as the CV type and number of folds) to the output .json file.
v1.4.0: tree models, more parallelization, housekeeping
Made many changes aimed at fixing bugs (particularly a few that occurred when using MLPs, RNNs, or LCEN with lag > 0) or increasing the number of hyperparameters the user can directly pass to main_SPA(). Another important housekeeping change is that LCEN (and its hyperparameters and subvariables) is now consistently called LCEN.
Gradient-Boosted Decision Trees and AdaBoost models were also added, and all tree-based and SVM models were parallelized.
Finally, more examples were added, and the old MATLAB files/code were removed.
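A hedged sketch of the expanded hyperparameter pass-through described above; aside from main_SPA(), model_name, the LCEN name, and the existence of a lag setting, the keyword spellings and model labels are assumptions.

```python
import SPA

# Sketch only: hyperparameters are passed straight to main_SPA().
# The 'GBDT' and 'AdaBoost' labels and the lag keyword spelling are assumptions.
SPA.main_SPA(
    'process_data.csv',                         # placeholder dataset path
    model_name = ['LCEN', 'GBDT', 'AdaBoost'],  # LCEN plus the new tree ensembles
    lag = 2,                                    # lagged inputs for dynamic LCEN models
)
```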
v1.3.1: bugfixes and code style improvements
Primarily bugfixes and code style changes. Made ace-cream an optional dependency. Fixed some (D)ALVEN variable unscaling issues.
v1.3: LCEN: a new variable selection algorithm for nonlinear, interpretable models
Created LCEN, a variable selection algorithm used with the expanded feature set of (D)ALVEN to create nonlinear, interpretable ML models. This new algorithm is much better at selecting the correct variables, ignoring irrelevant variables, and producing models with lower errors and greater stability.
Also added new examples that better highlight how to use SPA.
v1.2: Fewer dependencies + MLPs and RNNs
Eliminated R and TensorFlow dependencies. Added support for MLPs and RNNs, including support for automatic hyperparameter expansion during cross-validation. Miscellaneous bug fixes.
v1.1: For the "Linear and Neural Network Models for Predicting N-glycosylation in Chinese Hamster Ovary Cells Based on B4GALT Levels" publication.
For the "Linear and Neural Network Models for Predicting N-glycosylation in Chinese Hamster Ovary Cells Based on B4GALT Levels" publication. See https://github.com/PedroSeber/CHO_N-glycosylation_prediction