Official Repository for the NeurIPS 2024 paper Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching.
(Note: The code will undergo some refactoring in the near future.)
If you use this code, please cite the paper published in the Proceedings of NeurIPS 2024:
@inproceedings{morenorough,
  title={Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching},
  author={Moreno-Pino, Fernando and Arroyo, Alvaro and Waldon, Harrison and Dong, Xiaowen and Cartea, Alvaro},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024}
}
First, clone the repository:

git clone https://github.com/AlvaroArroyo/RFormer.git
cd RFormer

To ensure compatibility, the repository includes an rformer.yml file for creating a conda environment with all necessary dependencies. To set up the environment:

conda env create -f rformer.yml
conda activate rformer
The paper includes experiments on both synthetic (src/other) and UEA (src/UEA) datasets.
To train the model on synthetic data:
python src/other/main_classification_synthetic_long.py
To train the model on UEA datasets (https://www.timeseriesclassification.com):
python src/UEA/main.py
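For intuition about the signature patching named in the title, the following is a minimal, self-contained sketch (not the repo's implementation): it splits a time series into patches and computes each patch's truncated depth-2 path signature via Chen's iterated-integral formula for piecewise-linear paths. The helper names (sig2, signature_patch_features) and the patching scheme are illustrative assumptions, not the actual RFormer API.

```python
import numpy as np

def sig2(path):
    """Depth-2 truncated signature of a piecewise-linear path.

    path: (T, d) array of samples. Returns (s1, s2): the level-1 term
    (total increment, shape (d,)) and the level-2 iterated-integral
    term (shape (d, d)).
    """
    inc = np.diff(path, axis=0)            # per-step increments
    s1 = inc.sum(axis=0)                   # level 1: x_T - x_0
    # Chen's formula for piecewise-linear paths:
    # S2 = sum_{l<k} inc_l (x) inc_k  +  1/2 sum_k inc_k (x) inc_k
    before = np.cumsum(inc, axis=0) - inc  # increments strictly before step k
    s2 = before.T @ inc + 0.5 * np.einsum('ki,kj->ij', inc, inc)
    return s1, s2

def signature_patch_features(series, num_patches=4):
    """Hypothetical helper: split a series into non-overlapping patches
    and concatenate each patch's depth-2 signature into one feature
    vector of length d + d*d per patch."""
    patches = np.array_split(series, num_patches, axis=0)
    feats = [np.concatenate([s1, s2.ravel()])
             for s1, s2 in (sig2(p) for p in patches)]
    return np.stack(feats)                 # (num_patches, d + d*d)
```

Because signatures are invariant to sampling rate within a patch, features like these can be computed from irregularly sampled inputs, which is one motivation for the approach.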