Official Repository for the NeurIPS 2024 paper Rough Transformers: Lightweight Continuous-Time Sequence Modelling with Path Signatures


RFormer

Official Repository for the NeurIPS 2024 paper Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching.

(Note: The code will undergo some refactoring in the near future.)

If you use this code, please cite the paper published in the Proceedings of NeurIPS 2024:

@inproceedings{morenorough,
  title={Rough Transformers: Lightweight and Continuous Time Series Modelling through Signature Patching},
  author={Moreno-Pino, Fernando and Arroyo, Alvaro and Waldon, Harrison and Dong, Xiaowen and Cartea, Alvaro},
  booktitle={The Thirty-eighth Annual Conference on Neural Information Processing Systems},
  year={2024}
}

Requirements

To ensure compatibility, the repository includes a rformer.yml file for creating a conda environment with all necessary dependencies.

Setting Up the Environment

To set up the environment:

conda env create -f rformer.yml
conda activate rformer

Running the Code

First, clone the repository:

git clone https://github.com/AlvaroArroyo/RFormer.git
cd RFormer

The paper includes experiments on both synthetic datasets (src/other) and UEA datasets (src/UEA).

To train the model on synthetic data:

python src/other/main_classification_synthetic_long.py

To train the model on UEA datasets (https://www.timeseriesclassification.com):

python src/UEA/main.py
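The core idea behind the model is signature patching: the input time series is split into windows (patches), and a truncated path signature is computed over each window to produce the token sequence fed to the transformer. As a rough illustration only, and not the repository's actual implementation, the following NumPy sketch computes depth-2 signature features per patch; the function names and the discretisation are illustrative assumptions.

```python
import numpy as np

def depth2_signature(path):
    """Depth-2 signature features of a path of shape (L, d):
    level 1 is the total increment, level 2 is a discrete
    (cumulative-sum) approximation of the iterated integrals."""
    increments = np.diff(path, axis=0)                 # (L-1, d)
    level1 = increments.sum(axis=0)                    # (d,)
    # Running sum of increments up to (but excluding) each step,
    # giving sum_k S_{k-1, i} * dx_{k, j} for level 2.
    running = np.vstack([np.zeros(path.shape[1]),
                         np.cumsum(increments, axis=0)[:-1]])
    level2 = running.T @ increments                    # (d, d)
    return np.concatenate([level1, level2.ravel()])    # (d + d*d,)

def signature_patches(series, num_patches):
    """Split a (L, d) series into patches and return one
    depth-2 signature feature vector per patch."""
    windows = np.array_split(series, num_patches, axis=0)
    return np.stack([depth2_signature(w) for w in windows])

series = np.random.default_rng(0).standard_normal((128, 3))
tokens = signature_patches(series, num_patches=8)
print(tokens.shape)  # (8, 12): 8 patches, each with 3 + 3*3 features
```

Because each patch is summarised by a fixed-size signature vector, the transformer's input length depends on the number of patches rather than on the raw sequence length, which is what makes the approach lightweight for long or irregularly sampled series.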
