A repository for a data-free prototype-based generative classifier with latent distillation for time series class-incremental learning. This repository is built on top of the unified TSCIL (Time Series Class-Incremental Learning) framework.
- More datasets are coming.
- [GCPP] (data-free prototype-based generative classifier with latent distillation)
Regularization-based:
Replay-based:
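As an illustration of the prototype-based classification idea named above, here is a minimal, hypothetical sketch (not the repository's actual implementation): one mean latent vector is stored per class as its prototype, and a sample is assigned to the class of the nearest prototype. The class name `PrototypeClassifier` and its methods are assumptions for this example.

```python
import numpy as np

class PrototypeClassifier:
    """Toy nearest-class-mean classifier over latent features (illustrative only)."""

    def __init__(self):
        self.prototypes = {}  # class label -> mean latent vector

    def fit_prototypes(self, latents, labels):
        # Store one class-mean prototype per label; in class-incremental
        # learning, new classes simply add new entries to this dict.
        for c in np.unique(labels):
            self.prototypes[c] = latents[labels == c].mean(axis=0)

    def predict(self, latents):
        classes = sorted(self.prototypes)
        protos = np.stack([self.prototypes[c] for c in classes])
        # Euclidean distance from each sample to each prototype.
        dists = np.linalg.norm(latents[:, None, :] - protos[None, :, :], axis=-1)
        return np.array(classes)[dists.argmin(axis=1)]
```

Because prototypes are per-class statistics rather than stored samples, old training data is not needed when new classes arrive, which is what makes this family of methods attractive for the data-free setting.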
Our implementation uses the source code from the following repositories:
- TSCIL: Class-incremental Learning for Time Series: Benchmark and Evaluation
- CIL: Class-incremental Learning with Generative Classifiers
- Framework & Buffer & LwF & ER & ASER: Online Continual Learning in Image Classification: An Empirical Survey
- EWC & SI & MAS: Avalanche: an End-to-End Library for Continual Learning
- DER: Mammoth - An Extendible (General) Continual Learning Framework for Pytorch
- DeepInversion: Dreaming to Distill: Data-free Knowledge Transfer via DeepInversion
- Herding & Mnemonics: Mnemonics Training: Multi-Class Incremental Learning without Forgetting
- Soft-DTW: Soft DTW for PyTorch in CUDA
- CNN: AdaTime: A Benchmarking Suite for Domain Adaptation on Time Series Data
- TST & lr scheduler: PatchTST: A Time Series is Worth 64 Words: Long-term Forecasting with Transformers
- Generator: TimeVAE for Synthetic Timeseries Data Generation