Prep for release v0.2.0 (#172)
Prep for release v0.2.0 ...

Co-authored-by: Amirhessam Tahmassebi <admin@slickml.com>
amirhessam88 and Amirhessam Tahmassebi authored Nov 28, 2022
1 parent 664df08 commit 6c9dda4
Showing 21 changed files with 2,204 additions and 8,644 deletions.
10 changes: 9 additions & 1 deletion CHANGELOG.md
@@ -5,12 +5,20 @@
- Please checkout [SlickML Official Releases](https://github.com/slickml/slick-ml/releases) for more details.


---
## 📍 Unreleased Version X.X.X - XXXX-XX-XX
### 🛠 Fixed

### 🔥 Added
---
## 📍 Version 0.2.0 - 2022-11-27

### 🛠 Fixed
- [#170](https://github.com/slickml/slick-ml/pull/170) enabled more `flake8` plugins and fixed `poe check` command and `mypy` dependencies.
- [#169](https://github.com/slickml/slick-ml/pull/169) refactored `XGBoostHyperOptimizer` class.

### 🔥 Added
- [#171](https://github.com/slickml/slick-ml/pull/171) added `type-stubs` and rolled out type checking with `mypy` across library.

---

## 📍 Version 0.2.0-beta.2 - 2022-11-13
6 changes: 4 additions & 2 deletions README.md
@@ -5,9 +5,9 @@
[![codecov](https://codecov.io/gh/slickml/slick-ml/branch/master/graph/badge.svg?token=Z7XP51MB4K)](https://codecov.io/gh/slickml/slick-ml)
![dependencies](https://img.shields.io/librariesio/github/slickml/slick-ml)
[![license](https://img.shields.io/github/license/slickml/slick-ml)](https://github.com/slickml/slick-ml/blob/master/LICENSE/)
[![downloads](https://pepy.tech/badge/slickml)](https://pepy.tech/project/slickml)
![pypi_version](https://img.shields.io/pypi/v/slickml)
![python_version](https://img.shields.io/pypi/pyversions/slickml)
[![downloads](https://pepy.tech/badge/slickml)](https://pepy.tech/project/slickml)
[![slack_invite](https://badgen.net/badge/Join/SlickML%20Slack/purple?icon=slack)](https://www.slickml.com/slack-invite)
![twitter_url](https://img.shields.io/twitter/url?style=social&url=https%3A%2F%2Ftwitter.com%2FSlickML)

@@ -43,7 +43,9 @@ good portion of the tasks based on tabular data can be addressed via gradient bo
linear models<sup>[1](https://arxiv.org/pdf/2207.08815.pdf)</sup>. SlickML provides Data Scientists
with a toolbox to quickly prototype solutions for a given problem with minimal code while maximizing
the amount of information that can be inferred. Additionally, the prototype solutions can be easily
promoted and served in production with our recommended recipes.
promoted and served in production with our recommended recipes via various model serving frameworks
including [ZenML](https://github.com/zenml-io/zenml), [BentoML](https://github.com/bentoml/BentoML),
and [Prefect](https://github.com/PrefectHQ/prefect). More details coming soon 🤞 ...
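As a rough sketch of the "minimal code" prototyping flow the README paragraph above describes, the snippet below trains one of the quick-start classifiers shipped with this release. The class name comes from the bundled example notebooks; the import path, default arguments, and method names are assumptions for illustration and may differ from the published API.

```python
# Hypothetical quick-start sketch: the class name is taken from the example
# notebooks in this commit; the import path and call signatures are assumptions.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

from slickml.classification import XGBoostCVClassifier  # assumed import path

# Small public dataset used purely for illustration
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=1367)

# Prototype a cross-validated XGBoost classifier with (mostly) default settings
clf = XGBoostCVClassifier()
clf.fit(X_train, y_train)
y_pred_proba = clf.predict_proba(X_test)
```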


## 📖 Documentation
587 changes: 319 additions & 268 deletions examples/quick-starts/classification/GLMNetCVClassifier.ipynb

Large diffs are not rendered by default.

729 changes: 396 additions & 333 deletions examples/quick-starts/classification/XGBoostCVClassifier.ipynb

Large diffs are not rendered by default.

698 changes: 358 additions & 340 deletions examples/quick-starts/classification/XGBoostClassifier.ipynb

Large diffs are not rendered by default.

729 changes: 378 additions & 351 deletions examples/quick-starts/metrics/BinaryClassificationMetrics.ipynb

Large diffs are not rendered by default.

115 changes: 61 additions & 54 deletions examples/quick-starts/metrics/RegressionMetrics.ipynb

Large diffs are not rendered by default.

23 changes: 15 additions & 8 deletions examples/quick-starts/optimization/XGBoostBayesianOptimizer.ipynb
@@ -30,7 +30,7 @@
"output_type": "stream",
"name": "stdout",
"text": [
"Loaded SlickML Version = 0.2.0b1\n"
"Loaded SlickML Version = 0.2.0\n"
]
}
],
@@ -56,11 +56,10 @@
" | \n",
" | XGBoost Hyper-Parameters Tuner using Bayesian Optimization.\n",
" | \n",
" | This is wrapper using Bayesian Optimization algorithm [bayesian-optimization]_to tune the\n",
" | This is wrapper using Bayesian Optimization algorithm [bayesian-optimization]_ to tune the\n",
" | hyper-parameter of XGBoost [xgboost-api]_ using ``xgboost.cv()`` functionality with n-folds\n",
" | cross-validation iteratively. This feature can be used to find the set of optimized set of\n",
" | hyper-parameters for both classification and regression tasks.\n",
" | the optimized set of parameters before training.\n",
" | \n",
" | Notes\n",
" | -----\n",
@@ -91,7 +90,7 @@
" | objective : str, optional\n",
" | Objective function depending on the task whether it is regression or classification. Possible\n",
" | objectives for classification ``\"binary:logistic\"`` and for regression ``\"reg:logistic\"``,\n",
" | ``\"reg:squaredlogerror\"``, and ``\"reg:squaredlogerror\"``, by default \"binary:logistic\"\n",
" | ``\"reg:squarederror\"``, and ``\"reg:squaredlogerror\"``, by default \"binary:logistic\"\n",
" | \n",
" | acquisition_criterion : str, optional\n",
" | Acquisition criterion method with possible options of ``\"ei\"`` (Expected Improvement),\n",
@@ -223,10 +222,10 @@
" | \n",
" | Parameters\n",
" | ----------\n",
" | X_train : Union[pd.DataFrame, np.ndarray]\n",
" | X : Union[pd.DataFrame, np.ndarray]\n",
" | Input data for training (features)\n",
" | \n",
" | y_train : Union[List[float], np.ndarray, pd.Series]\n",
" | y : Union[List[float], np.ndarray, pd.Series]\n",
" | Input ground truth for training (targets)\n",
" | \n",
" | Returns\n",
@@ -254,7 +253,7 @@
" | -------\n",
" | BayesianOptimization\n",
" | \n",
" | get_params_bounds(self) -> Dict[str, Tuple[Union[int, float], Union[int, float]]]\n",
" | get_params_bounds(self) -> Optional[Dict[str, Tuple[Union[int, float], Union[int, float]]]]\n",
" | Returns the hyper-parameters boundaries for the tuning process.\n",
" | \n",
" | Returns\n",
@@ -812,7 +811,7 @@
"output_type": "execute_result",
"data": {
"text/plain": [
"<bayes_opt.bayesian_optimization.BayesianOptimization at 0x12dd73c10>"
"<bayes_opt.bayesian_optimization.BayesianOptimization at 0x1319f69a0>"
]
},
"metadata": {},
@@ -1276,6 +1275,14 @@
}
],
"metadata": {}
},
{
"cell_type": "markdown",
"source": [
"### Feel free to add your favorite `Example` via a `pull-request`.\n",
"### More details can be found in our [Contributing Document](https://github.com/slickml/slick-ml/blob/master/CONTRIBUTING.md)."
],
"metadata": {}
}
],
"metadata": {
