87 repositories
Time-Series-Library (Public)
A Library for Advanced Deep Time Series Models.

OpenLTM (Public)
Open-Source Implementations of Large Time-Series Models

RoPINN (Public)
Code release for "RoPINN: Region Optimized Physics-Informed Neural Networks" (NeurIPS 2024), https://arxiv.org/abs/2405.14369
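The RoPINN entry builds on physics-informed neural networks. As background only, here is a minimal generic PINN training loop in PyTorch, not the region-optimization strategy the paper proposes; the toy ODE (du/dt + u = 0, u(0) = 1), the network size, and the training schedule are all illustrative assumptions.

```python
# Minimal generic PINN sketch (NOT the RoPINN region-optimization method):
# fit u(t) satisfying du/dt + u = 0 with u(0) = 1 by penalizing the ODE residual.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    t = torch.rand(256, 1, requires_grad=True)               # collocation points in [0, 1]
    u = net(t)
    du_dt = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    residual = du_dt + u                                      # ODE residual du/dt + u = 0
    loss_pde = residual.pow(2).mean()
    loss_ic = (net(torch.zeros(1, 1)) - 1.0).pow(2).mean()    # initial condition u(0) = 1
    loss = loss_pde + loss_ic
    opt.zero_grad(); loss.backward(); opt.step()

print(net(torch.tensor([[1.0]])))  # should approach exp(-1) ≈ 0.368
```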
iTransformer (Public)
Official implementation for "iTransformer: Inverted Transformers Are Effective for Time Series Forecasting" (ICLR 2024 Spotlight), https://openreview.net/forum?id=JePfAI8fah

- 2024 up-to-date list of DATASETS, CODEBASES and PAPERS on Multi-Task Learning (MTL), from a Machine Learning perspective.
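The iTransformer repository listed above is described by its title: the Transformer's dimensions are inverted so that each variate's whole lookback window becomes one token and attention mixes variates rather than time steps. The following is a minimal sketch of that idea in PyTorch, not the official implementation; layer sizes and the plain linear heads are assumptions for illustration.

```python
# Sketch of the "inverted" attention idea named by iTransformer (illustrative, not the
# official code): each variate's whole lookback window becomes one token, and
# self-attention mixes information across variates rather than across time steps.
import torch
import torch.nn as nn

class InvertedForecaster(nn.Module):
    def __init__(self, seq_len=96, pred_len=24, d_model=128, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(seq_len, d_model)          # whole series -> one variate token
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, pred_len)          # token -> that variate's forecast

    def forward(self, x):                                 # x: [batch, seq_len, n_variates]
        tokens = self.embed(x.transpose(1, 2))            # [batch, n_variates, d_model]
        tokens = self.encoder(tokens)                     # attention across variates
        return self.head(tokens).transpose(1, 2)          # [batch, pred_len, n_variates]

y = InvertedForecaster()(torch.randn(8, 96, 7))
print(y.shape)  # torch.Size([8, 24, 7])
```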
Large-Time-Series-Model (Public)
Official code, datasets and checkpoints for "Timer: Generative Pre-trained Transformers Are Large Time Series Models" (ICML 2024)
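The Timer entry describes a generative pre-trained (GPT-style) Transformer for time series. Below is a hedged sketch of the general decoder-only, next-patch-prediction setup such models use; the patch length, model width, and the plain MSE objective are placeholders, not the repository's actual configuration.

```python
# Sketch of a decoder-only, next-patch-prediction setup for time series
# (illustrative assumptions throughout; not the official Timer configuration).
import torch
import torch.nn as nn

class PatchGPT(nn.Module):
    def __init__(self, patch_len=24, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.blocks = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)

    def forward(self, series):                             # series: [batch, length] (univariate)
        b, l = series.shape
        patches = series.view(b, l // self.patch_len, self.patch_len)
        h = self.embed(patches)
        t = h.size(1)
        causal = torch.triu(torch.full((t, t), float('-inf')), diagonal=1)  # causal mask
        h = self.blocks(h, mask=causal)
        return self.head(h)                                # prediction for the next patch

model = PatchGPT()
x = torch.randn(8, 24 * 10)                                # 10 patches per sequence
pred = model(x)                                            # [8, 10, 24]
target = x.view(8, 10, 24)[:, 1:]                          # next-patch targets
loss = nn.functional.mse_loss(pred[:, :-1], target)        # autoregressive pretraining loss
```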
AutoTimes (Public)
Official implementation for "AutoTimes: Autoregressive Time Series Forecasters via Large Language Models"

depyf (Public)

iVideoGPT (Public)
Official repository for "iVideoGPT: Interactive VideoGPTs are Scalable World Models" (NeurIPS 2024), https://arxiv.org/abs/2405.15223

DeepLag (Public)
Code release for "DeepLag: Discovering Deep Lagrangian Dynamics for Intuitive Fluid Prediction" (NeurIPS 2024), https://arxiv.org/abs/2402.02425

Diffusion-Tuning (Public)

BTTackler (Public)

TimeXer (Public)

Transolver (Public)
Code release of "Transolver: A Fast Transformer Solver for PDEs on General Geometries" (ICML 2024 Spotlight), https://arxiv.org/abs/2402.02366

ContextWM (Public)
Code release for "Pre-training Contextualized World Models with In-the-wild Videos for Reinforcement Learning" (NeurIPS 2023), https://arxiv.org/abs/2305.18499

timer (Public)

- Code release for "Non-stationary Transformers: Exploring the Stationarity in Time Series Forecasting" (NeurIPS 2022), https://arxiv.org/abs/2205.14415
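The Non-stationary Transformers description above targets distribution shift within and across forecasting windows. A minimal sketch of the per-window stationarization-and-restoration step this line of work builds on follows; the paper's de-stationary attention is not reproduced, and the pass-through "forecast" is purely illustrative.

```python
# Sketch of per-window series stationarization: normalize each input window by its own
# statistics, run any forecaster, then restore the statistics on the output.
# (The de-stationary attention of the paper is not shown.)
import torch

def stationarize(x, eps=1e-5):                  # x: [batch, seq_len, n_variates]
    mean = x.mean(dim=1, keepdim=True)
    std = x.std(dim=1, keepdim=True) + eps
    return (x - mean) / std, mean, std

def destationarize(y, mean, std):               # y: [batch, pred_len, n_variates]
    return y * std + mean

x = torch.randn(8, 96, 7) * 10 + 50             # raw, non-zero-mean window
x_norm, mu, sigma = stationarize(x)
y_norm = x_norm[:, -24:, :]                     # placeholder "forecast" for illustration
y = destationarize(y_norm, mu, sigma)           # forecast back on the original scale
```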
Multi-Embedding (Public)

MobileAttention (Public)
Official implementation of "Mobile Attention: Mobile-Friendly Linear-Attention for Vision Transformers in PyTorch". To run the code, you can refer to https://github.com/thuml/Flowformer.

HelmFluid (Public)
Code release of "HelmFluid: Learning Helmholtz Dynamics for Interpretable Fluid Prediction" (ICML 2024), https://arxiv.org/pdf/2310.10565

Flowformer (Public)
Code release for "Flowformer: Linearizing Transformers with Conservation Flows" (ICML 2022), https://arxiv.org/pdf/2202.06258.pdf
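Flowformer's goal is linear-complexity attention. For orientation, here is a generic kernelized linear-attention computation in O(N); this is the standard linear-attention trick, not the paper's conservation-flow mechanism, and the elu-plus-one feature map is an assumption.

```python
# Generic kernelized linear attention in O(N) (background only; NOT the
# conservation-flow mechanism of Flowformer).
import torch

def linear_attention(q, k, v, eps=1e-6):
    # q, k: [batch, heads, length, dim], v: [batch, heads, length, dim_v]
    q = torch.nn.functional.elu(q) + 1                    # positive feature map (assumption)
    k = torch.nn.functional.elu(k) + 1
    kv = torch.einsum('bhld,bhle->bhde', k, v)            # sum over positions: k_l v_l^T
    z = 1.0 / (torch.einsum('bhld,bhd->bhl', q, k.sum(dim=2)) + eps)   # per-query normalizer
    return torch.einsum('bhld,bhde,bhl->bhle', q, kv, z)

q = k = v = torch.randn(2, 4, 1024, 32)
out = linear_attention(q, k, v)                           # [2, 4, 1024, 32]
```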
Koopa (Public)
Code release for "Koopa: Learning Non-stationary Time Series Dynamics with Koopman Predictors" (NeurIPS 2023), https://arxiv.org/abs/2305.18803
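The Koopa entry names Koopman predictors: dynamics that become approximately linear in a learned latent space. A minimal sketch of that structure follows, with an encoder, a learned linear operator rolled forward, and a decoder; the sizes and the single-operator design are illustrative assumptions, not the official architecture.

```python
# Sketch of a Koopman-style predictor (illustrative, not the official Koopa model):
# lift a window into a latent space, advance it with a learned *linear* operator K, decode.
import torch
import torch.nn as nn

class KoopmanForecaster(nn.Module):
    def __init__(self, seq_len=96, n_vars=7, latent=64):
        super().__init__()
        self.encoder = nn.Linear(seq_len * n_vars, latent)
        self.K = nn.Linear(latent, latent, bias=False)     # linear Koopman operator
        self.decoder = nn.Linear(latent, seq_len * n_vars)
        self.seq_len, self.n_vars = seq_len, n_vars

    def forward(self, x, steps=1):                         # x: [batch, seq_len, n_vars]
        z = self.encoder(x.flatten(1))
        outs = []
        for _ in range(steps):                             # roll the linear dynamics forward
            z = self.K(z)
            outs.append(self.decoder(z).view(-1, self.seq_len, self.n_vars))
        return torch.cat(outs, dim=1)                      # [batch, steps * seq_len, n_vars]

y = KoopmanForecaster()(torch.randn(8, 96, 7), steps=2)
print(y.shape)  # torch.Size([8, 192, 7])
```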
HarmonyDream (Public)
Code release for "HarmonyDream: Task Harmonization Inside World Models" (ICML 2024), https://arxiv.org/abs/2310.00344

TimeSiam (Public)

- Transfer Learning Library for Domain Adaptation, Task Adaptation, and Domain Generalization

SimMTM (Public)
Code release for "SimMTM: A Simple Pre-Training Framework for Masked Time-Series Modeling" (NeurIPS 2023 Spotlight), https://arxiv.org/abs/2302.00861
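SimMTM is a masked time-series pre-training framework. The sketch below shows the generic masked-reconstruction objective this family of methods starts from; SimMTM's specific aggregation over multiple masked views is not reproduced, and the GRU encoder and 50% masking ratio are assumptions.

```python
# Generic masked time-series pretraining step (the objective family SimMTM belongs to;
# SimMTM's aggregation of several masked views is not reproduced here).
import torch
import torch.nn as nn

encoder = nn.GRU(input_size=7, hidden_size=64, batch_first=True)
head = nn.Linear(64, 7)
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)

x = torch.randn(32, 96, 7)                                 # a batch of raw windows
mask = (torch.rand(32, 96, 1) < 0.5).float()               # 1 = keep, 0 = masked out
x_masked = x * mask                                        # zero out masked time points

hidden, _ = encoder(x_masked)
recon = head(hidden)                                       # reconstruct the full series
masked_elems = (1 - mask).expand_as(x)
loss = ((recon - x) ** 2 * masked_elems).sum() / masked_elems.sum().clamp(min=1.0)
opt.zero_grad(); loss.backward(); opt.step()
```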
TimesNet (Public)
Code release for "TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis" (ICLR 2023), https://openreview.net/pdf?id=ju_Uqw384Oq
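TimesNet's title refers to temporal 2D-variation modeling. The sketch below illustrates the underlying move: estimate a dominant period with the FFT, fold the 1D series into a 2D (cycles x period) tensor, and apply 2D convolutions; the official model's inception blocks and amplitude-weighted aggregation over several periods are omitted, and the single Conv2d is an assumption.

```python
# Sketch of FFT-based period detection plus 1D -> 2D folding (illustrative only).
import torch
import torch.nn as nn

def dominant_period(x):                                    # x: [batch, length, channels]
    amp = torch.fft.rfft(x, dim=1).abs().mean(dim=(0, 2))  # average amplitude spectrum
    amp[0] = 0                                             # ignore the DC component
    freq = int(torch.argmax(amp))
    return max(x.size(1) // max(freq, 1), 1)               # period in time steps

x = torch.randn(8, 96, 7)
p = dominant_period(x)
pad = (-x.size(1)) % p                                     # pad so the length divides the period
x_pad = torch.nn.functional.pad(x, (0, 0, 0, pad))
folded = x_pad.permute(0, 2, 1).reshape(8, 7, -1, p)       # [batch, channels, n_cycles, period]
conv = nn.Conv2d(7, 7, kernel_size=3, padding=1)
out2d = conv(folded)                                       # intra- and inter-period variation
restored = out2d.reshape(8, 7, -1).permute(0, 2, 1)[:, :x.size(1), :]
```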
Latent-Spectral-Models (Public)
Code release for "Solving High-Dimensional PDEs with Latent Spectral Models" (ICML 2023), https://arxiv.org/abs/2301.12664

Autoformer (Public)
Code release for "Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting" (NeurIPS 2021), https://arxiv.org/abs/2106.13008
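The Autoformer entry centers on decomposition Transformers. Below is a minimal sketch of the series-decomposition step that splits a window into a trend part (moving average) and a seasonal part (remainder); the Auto-Correlation mechanism itself is not reproduced, and the kernel size of 25 is an illustrative assumption.

```python
# Sketch of moving-average series decomposition into trend and seasonal components.
import torch
import torch.nn as nn

class SeriesDecomp(nn.Module):
    def __init__(self, kernel_size=25):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1)

    def forward(self, x):                                  # x: [batch, length, channels]
        # repeat the edge values so the moving average keeps the original length
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1).permute(0, 2, 1)
        trend = self.avg(padded).permute(0, 2, 1)          # smooth trend component
        seasonal = x - trend                               # remainder = seasonal component
        return seasonal, trend

seasonal, trend = SeriesDecomp()(torch.randn(8, 96, 7))
print(seasonal.shape, trend.shape)  # both torch.Size([8, 96, 7])
```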
ForkMerge (Public)
Code release of paper "ForkMerge: Mitigating Negative Transfer in Auxiliary-Task Learning" (NeurIPS 2023)