A transformer-based approach to predicting MEG readings from EEG sensory inputs. "Deep Learning" @ MVA, 2024.
Updated Feb 25, 2025 - Python
Tokenization Matters: A Fair Ablation of Point-wise and Variate-wise Transformers for Financial Time Series (includes PatchTST, iTransformer, Crossformer, Autoformer, FEDformer, Informer, TimesNet, and non-stationarity extensions).