carefree-learn 0.1.8


Release Notes

carefree-learn 0.1.8 mainly registered all PyTorch schedulers and enhanced mlflow integration.

Backward Compatibility Breaking

carefree-learn now keeps a copy of the original, user-defined configs (#48), which changes the format of the saved config file:

v0.1.7 (`config.json`):

```json
{
    "data_config": {
        "label_name": "Survived"
    },
    "cuda": 0,
    "model": "tree_dnn",
    // the `binary_config` was injected into `config.json`
    "binary_config": {
        "binary_metric": "acc",
        "binary_threshold": 0.49170631170272827
    }
}
```

v0.1.8 (`config_bundle.json`):

```json
{
    "config": {
        "data_config": {
            "label_name": "Survived"
        },
        "cuda": 0,
        "model": "tree_dnn"
    },
    "increment_config": {},
    "binary_config": {
        "binary_metric": "acc",
        "binary_threshold": 0.49170631170272827
    }
}
```
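
Since `config_bundle.json` is plain JSON, it can be inspected with the standard library. Below is a minimal sketch, assuming the file sits next to the other saved model artifacts; the folder name is a placeholder for illustration, not an API guarantee:

```python
import json

# Minimal sketch: inspect a saved `config_bundle.json` (v0.1.8+).
# The path below is a placeholder; point it at wherever your model was saved.
with open("saved_model/config_bundle.json", "r") as f:
    bundle = json.load(f)

user_config = bundle["config"]                 # the original, user-defined config
increment_config = bundle["increment_config"]  # incremental overrides (empty above)
binary_config = bundle["binary_config"]        # e.g. binary_metric / binary_threshold
print(user_config["model"])                    # -> "tree_dnn"
```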

New Schedulers

carefree-learn now registers schedulers based on all of the schedulers provided by PyTorch.

These schedulers can be used simply by specifying scheduler=... in any high-level API of carefree-learn, e.g.:

```python
import cflearn

m = cflearn.make(scheduler="cyclic").fit(x, y)
```
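
If the chosen scheduler needs extra arguments, they can presumably be forwarded as a dict; the `scheduler_config` name and its keys below are assumptions (these notes do not spell them out), so verify them against the documentation:

```python
import cflearn

# Hypothetical sketch: forward keyword arguments to the underlying
# torch.optim.lr_scheduler.CyclicLR; `scheduler_config` is an assumed name here.
m = cflearn.make(
    scheduler="cyclic",
    scheduler_config={"base_lr": 1e-4, "max_lr": 1e-3},
).fit(x, y)
```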

Better mlflow Integration

To make better use of mlflow, carefree-learn now takes care of several best practices for you under the hood (see the sketch after this list for the manual equivalent), e.g.:

  • Makes the initialization of mlflow multi-thread safe in distributed training.
  • Automatically handles the run_name in distributed training.
  • Automatically handles the parameters for log_params.
  • Updates the artifacts periodically.
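
For a sense of what this saves, the sketch below shows roughly the manual equivalent with the plain mlflow API; the experiment name, run name, and artifact folder are made up for illustration, and this is not carefree-learn's internal implementation:

```python
import mlflow

# Roughly the bookkeeping that carefree-learn now automates (names are illustrative).
mlflow.set_experiment("titanic")
with mlflow.start_run(run_name="tree_dnn_rank0"):  # one run name per worker in distributed training
    mlflow.log_params({"model": "tree_dnn", "scheduler": "cyclic"})
    mlflow.log_artifacts("logging_folder")  # called repeatedly to keep artifacts up to date
```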

The (brief) documentation for the mlflow integration can be found here.