# Uncertainty-aware State Space Models

Code from my master's thesis at TUM.

This work is published at ICML'25 under the title *UnHiPPO: Uncertainty-aware Initialization for State Space Models* (https://arxiv.org/abs/2506.05065). Please see the contributions statement in the paper.

## Installation

Install pixi to create a consistent environment, then run `pixi install`.
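A minimal sketch of the setup, assuming pixi is installed per its official instructions and the environment is defined in this repository's pixi manifest:

```shell
# From the repository root, after installing pixi (https://pixi.sh):
pixi install   # resolve and install the environment from the manifest/lockfile
pixi shell     # optional: open a shell with the environment activated
```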

## Datasets

### Informer Datasets

The ETTh, ETTm, ECL, and Weather datasets are available for download from here. Alternatively, run the following commands.

```shell
wget -O informer.zip --no-check-certificate -r 'https://drive.google.com/uc?export=download&id=1XqpxE6cthIxKYviSmR703yU45vdQ1oHT'
mkdir -p data
unzip informer.zip -d data
```

### Speech Commands Dataset

The Speech Commands dataset is downloaded automatically.

### Mackey-Glass Dataset

Mackey-Glass sequences are automatically generated.
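For reference, Mackey-Glass sequences are conventionally generated from the Mackey-Glass delay differential equation; the parameter values shown are the common defaults in the literature, not confirmed from this code:

$$\frac{dx}{dt} = \beta\,\frac{x(t-\tau)}{1 + x(t-\tau)^{n}} - \gamma\,x(t), \qquad \beta = 0.2,\; \gamma = 0.1,\; n = 10,\; \tau = 17$$

With $\tau = 17$ the system is chaotic, which is what makes it a standard long-range sequence modeling benchmark.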

## Training

To start a training run, call `train.py` with your settings, for example:

```shell
./train.py data.batch_size=128
```

### Grid Search Experiments

You can configure experiments with YAML files in `config/experiment` and run them with `./train.py -m`, where `-m` instructs Hydra to launch multiple runs.

```shell
./train.py -m experiment=dim-sweep data.batch_size=32
```
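An experiment file could look like the following sketch. This is illustrative only: the key `model.state_dim` and the sweep values are hypothetical, and the actual schema depends on this repository's Hydra configs (the basic Hydra sweeper takes comma-separated grids under `hydra.sweeper.params`):

```yaml
# config/experiment/dim-sweep.yaml — illustrative sketch, not the actual file
# @package _global_
hydra:
  sweeper:
    params:
      model.state_dim: 64,128,256   # grid over hypothetical state dimensions
data:
  batch_size: 32
```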

### Submit runs to the SLURM cluster

You can submit a training run to SLURM with:

```shell
./train.py -m hydra/launcher=slurm hydra.launcher.partition=gpu_gtx_1080 <other overrides here>
```

Note that you need to pass `-m` even if you are only submitting a single run, so that Hydra uses the SLURM launcher instead of launching your run locally. You can override launcher parameters on the command line as usual, or set defaults in `config/hydra/launcher/slurm.yaml`.
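A launcher defaults file could be sketched as below, assuming the standard hydra-submitit-launcher plugin (the repository may instead ship a custom launcher; the resource values are placeholders):

```yaml
# config/hydra/launcher/slurm.yaml — illustrative sketch
defaults:
  - submitit_slurm           # extend the submitit SLURM launcher

partition: gpu_gtx_1080
timeout_min: 1440            # wall time in minutes
gpus_per_node: 1
cpus_per_task: 4
mem_gb: 16
```

Any of these can then be overridden per run, e.g. `hydra.launcher.timeout_min=240`.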
