SHAPformer: Explainable time-series forecasting with sampling-free SHAP for Transformers

This repository contains the code accompanying our publication on a new algorithm for estimating Shapley additive explanations (SHAP) for time-series forecasting. It introduces an explainable model, called SHAPformer, that relies on attention manipulation to evaluate feature subsets without the need for sampling from background data.
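To illustrate what SHAP values are (this is the quantity SHAPformer estimates, not the attention-manipulation algorithm itself), the sketch below computes exact Shapley values for a toy three-feature model by brute-force enumeration of all feature subsets, replacing "absent" features with a baseline value. All names here (`shapley_values`, the toy `predict` function) are illustrative and not part of this repository's code.

```python
from itertools import combinations
from math import factorial

def shapley_values(predict, x, baseline):
    """Exact Shapley values for a toy model via subset enumeration."""
    n = len(x)

    def value(subset):
        # Features in `subset` keep their actual value; the rest are
        # replaced by the baseline (the "absent feature" convention).
        masked = [x[i] if i in subset else baseline[i] for i in range(n)]
        return predict(masked)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for subset in combinations(others, size):
                # Classic Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                phi[i] += weight * (value(set(subset) | {i}) - value(set(subset)))
    return phi

# Toy linear "forecaster": for a linear model the Shapley values
# recover the per-feature contributions exactly.
predict = lambda v: 2 * v[0] + 3 * v[1] + v[2]
phi = shapley_values(predict, x=[1.0, 1.0, 1.0], baseline=[0.0, 0.0, 0.0])
```

This brute-force version scales as O(2^n) in the number of features, which is why practical methods resort to sampling from background data; SHAPformer's contribution is avoiding that sampling step for Transformers.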

Stay tuned for the upcoming publication:

Matthias Hertel, Sebastian Pütz, Ralf Mikut, Veit Hagenmeyer, Benjamin Schäfer. Explainable time-series forecasting with sampling-free SHAP for Transformers. Under Review (2025).

A talk about SHAPformer was given at the Helmholtz AI Conference 2025. The recording is available via YouTube (duration: 11 minutes).

Documentation

This repository demonstrates how to use the synthetic data with ground-truth explanations, how to train the model, and how to create explanations with SHAPformer.

System requirements

The requirements are listed in the file requirements.txt.

To create a fresh virtual environment and install the dependencies in it, execute the following lines:

python -m virtualenv venv
source venv/bin/activate
pip install -r requirements.txt

The expected installation duration is a few minutes.

Tested with Python 3.11 and 3.12.

Datasets

You can download the datasets used in the paper via this link: data.zip

The zip file contains both datasets:

  • The synthetic dataset with ground truth explanations.
  • The real-world dataset containing the electrical load of TransnetBW (original source: OPSD) and weather data (original source: Copernicus).

For a demonstration of how to use the synthetic dataset with ground truth, refer to this notebook: ground_truth_explanations.ipynb
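A minimal sketch for unpacking the downloaded archive, using only the standard library. The function name and the target directory are illustrative assumptions; adjust the paths to wherever data.zip was saved.

```python
import zipfile
from pathlib import Path

def unpack_datasets(archive: str, target: str) -> list[str]:
    """Extract the data archive and return the names of the contained files."""
    target_dir = Path(target)
    target_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(target_dir)
        return zf.namelist()

# e.g. unpack_datasets("data.zip", "data")
```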

Demos

The notebook training.ipynb demonstrates how to train the SHAPformer model (expected runtime on GPU: <5 minutes).

The notebook evaluate.ipynb demonstrates how to use the trained SHAPformer model to generate predictions and explanations (expected runtime on GPU: <5 minutes).
