Program | Google Summer of Code, 2024 |
---|---|
Organisation | Sktime: A unified framework for ML with time series |
Project | Sktime integration with deep learning backends - pytorch and huggingface - Dashboard |
Mentors | Franz Király - Benedikt Heidrich - Anirban Ray |
Project Length | 350 hours (Large) |
I worked with sktime as a Google Summer of Code student from late May to August 2024. This post summarises the work I've done over this period and serves as the work product submission required at the end of GSoC.
Sktime is a library for time series analysis in Python. It provides a unified interface for multiple time series learning tasks like classification, regression, clustering, annotation, and forecasting.
My project was focused on implementing and interfacing deep learning models in sktime leveraging PyTorch and Hugging Face.
- Data Science
- AI
- Time Series
- Toolbox Frameworks
- Machine Learning
- Deep Learning
- Python
- GitHub
- Pytorch
- Huggingface
- Scikit-Learn
- Enhanced skills in PyTorch and Hugging Face.
- Improved Python coding practices, focusing on writing efficient, high-quality code.
- Gained experience in test-driven development and designing optimal code solutions.
- Acquired knowledge of machine learning and deep learning techniques for time-series data analysis.
- Familiarized with time-series-related libraries and packages.
- Gained insights into the life cycle, development, and maintenance of a Python package through hands-on experience with sktime.
- Enhanced experience in open-source project contributions.
- Strengthened Git and GitHub skills.
- Improved communication with mentors and collaboration on complex design decisions.
- Initially, managing time was difficult, as it was my first experience working on a project of this scale. However, I quickly adapted by breaking down tasks and prioritizing them, which improved my efficiency.
- Maintaining consistency with daily stand-ups and weekly mentoring sessions was challenging at first. Over time, I became more comfortable with agile practices, and I eventually led the weekly stand-ups, which enhanced my productivity and integration with the team.
- Understanding and modifying a large, complex codebase was tough, but I tackled this by gradually familiarizing myself with key components, which made navigating and contributing to the codebase easier.
- Implementing features required deeper knowledge of certain libraries than I initially had. I addressed this by dedicating extra time to learning these libraries, which allowed me to implement features effectively.
- Designing solutions proved more challenging than implementing them. Recognizing this, I shifted my focus to developing efficient design strategies before execution, leading to more robust solutions.
These contributions primarily involve the implementation of new algorithms and the enhancement and fixing of existing ones.
Pull Request | Status | Title | Related Issue |
---|---|---|---|
#6928 | Draft | [ENH] Global Forecast API for BaseDeepNetworkPyTorch based interfaces | #6836 |
#6842 | Open | [ENH] Implements Autoregressive Wrapper | #6802 |
#6571 | Open | [ENH] interface to TimesFM Forecaster | #6408 |
#6791 | Merged | [ENH] Pytorch Classifier & de-novo implementation of Transformer | #6786 |
#6712 | Merged | [ENH] Interface to TinyTimeMixer foundation model | #6698 |
#6202 | Merged | [ENH] de-novo implementation of LTSFTransformer based on cure-lab research code base | #4939 |
#6457 | Merged | [ENH] Extend HFTransformersForecaster for PEFT methods | #6435 |
#6321 | Merged | [BUG] fixes failing test in neuralforecast auto freq, amid pandas freq deprecations | |
#6237 | Merged | [ENH] Update doc and behavior of freq="auto" in neuralforecast | |
#6367 | Merged | [MNT] final change cycle (0.30.0) for renaming cINNForecaster to CINNForecaster | #6120 |
#6238 | Merged | [MNT] change cycle (0.29.0) for renaming cINNForecaster to CINNForecaster | #6120 |
In addition to this, these PRs were submitted during the application review period.
Pull Request | Status | Title | Related Issue |
---|---|---|---|
#6121 | Merged | [MNT] initialize change cycle (0.28.0) for renaming cINNForecaster to CINNForecaster | #6120 |
#6039 | Merged | [ENH] NeuralForecastRNN should auto-detect freq | |
#6088 | Merged | [MNT] create build tool to check invalid backticks | |
#6023 | Merged | [DOC] Fix invalid use of single-grave in docstrings | |
#6116 | Merged | [ENH] Adds MSTL import statement in detrend | #6085 |
#6059 | Merged | [ENH] Examples for YtoX transformer docstring |
Here, I will walk through some of the major contributions from the above pull requests, in which I added estimators to sktime.
To see the working and inference of these estimators, please refer to CODE.ipynb.
- Title: [ENH] PyTorch Classifier & De-Novo Implementation of Transformer
- Status: Merged
- Pull Request: #6791
- Related Issue: #6786
- Research Paper: A Transformer-based Framework for Multivariate Time Series Representation Learning
- Official Code: gzerveas/mvts_transformer
- Sktime Source Code: sktime/classification/deep_learning/mvts_transformer.py
This pull request introduces the MVTSTransformerClassifier, based on the paper "A Transformer-based Framework for Multivariate Time Series Representation Learning", applying it to classification and regression.
I implemented the BaseDeepClassifierPytorch class as a foundation for PyTorch-based classifiers. Then, I used the TSTransformerEncoderClassiregressor to build the PyTorch network. Finally, I created the MVTSTransformerClassifier class to integrate the network with the base class.
This process enhanced my understanding of transformer architecture.
The estimator can be loaded into sktime using the following code:
```python
from sktime.classification.deep_learning import MVTSTransformerClassifier

model = MVTSTransformerClassifier(
    d_model=256,
    n_heads=4,
    num_layers=4,
    dim_feedforward=128,
    dropout=0.1,
    pos_encoding="fixed",
    activation="relu",
    norm="BatchNorm",
    freeze=False,
)
```
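As a rough illustration of how the classifier is used (this sketch is my own addition, not part of the PR description), it follows the standard sktime fit/predict workflow; the hyperparameters below are reduced purely to keep the example quick:

```python
# Minimal usage sketch: dataset and hyperparameters are illustrative only.
from sktime.classification.deep_learning import MVTSTransformerClassifier
from sktime.datasets import load_basic_motions

X_train, y_train = load_basic_motions(split="train")
X_test, y_test = load_basic_motions(split="test")

clf = MVTSTransformerClassifier(d_model=64, n_heads=2, num_layers=2, dim_feedforward=64)
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)
```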
For more details on how the estimator works, please refer to CODE.ipynb.
- Title: [ENH] Interface to TinyTimeMixer Foundation Model
- Status: Merged
- Pull Request: #6712
- Related Issue: #6698
- Research Paper: Tiny Time Mixers (TTMs): Fast Pre-trained Models for Enhanced Zero/Few-Shot Forecasting of Multivariate Time Series
- Official Code: ibm-granite/granite-tsfm
- Sktime Source Code: sktime/forecasting/ttm.py
TinyTimeMixer (TTM) is a compact, pre-trained model for time-series forecasting, developed and open-sourced by IBM Research.
In this PR, I integrated TTM into the sktime framework by forking the official code into the sktime/libs/granite_ttm directory, as the source package was not available on PyPI.
Next, I developed an interface for the estimator within the TinyTimeMixerForecaster class.
Throughout this implementation, I gained valuable experience in creating custom Hugging Face models and configurations, loading and modifying weights, altering architecture, and training newly initialized weights.
The estimator can now be loaded into sktime using the following code:
```python
from sktime.forecasting.ttm import TinyTimeMixerForecaster

model = TinyTimeMixerForecaster(
    model_path="ibm/TTM",
    revision="main",
    validation_split=0.2,
    config=None,
    training_args=None,
    compute_metrics=None,
    callbacks=None,
    broadcasting=False,
    use_source_package=False,
)
```
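For a rough picture of the workflow (the dataset and horizon here are placeholders of mine), the forecaster follows the usual sktime fit/predict pattern, running the pre-trained model zero-shot when no training arguments are supplied:

```python
# Illustrative zero-shot usage; dataset and horizon are placeholders.
from sktime.datasets import load_airline
from sktime.forecasting.ttm import TinyTimeMixerForecaster

y = load_airline()
forecaster = TinyTimeMixerForecaster()  # default pre-trained weights, zero-shot
forecaster.fit(y, fh=[1, 2, 3])
y_pred = forecaster.predict()
```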
For further details on how the estimator functions, please refer to CODE.ipynb.
- Title: [ENH] De-Novo Implementation of LTSFTransformer Based on Cure-Lab Research Codebase
- Status: Merged
- Pull Request: #6202
- Related Issue: #4939
- Research Paper: Are Transformers Effective for Time Series Forecasting?
- Official Code: cure-lab/LTSF-Linear
- Sktime Source Code: sktime/forecasting/ltsf.py
This pull request introduces the LTSFTransformer, an implementation based on the paper "Are Transformers Effective for Time Series Forecasting?", which explores the application of transformer architecture to time series forecasting.
To begin the implementation, I structured the transformer architecture in the sktime/networks/ltsf/layers directory, along with the PyTorch dataset class PytorchFormerDataset.
Next, I developed the LTSFTransformerNetwork interface class by leveraging the base PyTorch forecasting class, which connects to the network created in the previous step.
Throughout this implementation, I gained valuable insights into transformer architecture, particularly in applying various embeddings and encodings to temporal features in time series data.
The estimator can be loaded into sktime with the following code:
```python
from sktime.forecasting.ltsf import LTSFTransformerForecaster

model = LTSFTransformerForecaster(
    seq_len=30,
    context_len=15,
    pred_len=15,
    num_epochs=50,
    batch_size=8,
    in_channels=1,
    individual=False,
    criterion=None,
    criterion_kwargs=None,
    optimizer=None,
    optimizer_kwargs=None,
    lr=0.002,
    position_encoding=True,
    temporal_encoding=True,
    temporal_encoding_type="embed",  # linear, embed, fixed-embed
    d_model=32,
    n_heads=1,
    d_ff=64,
    e_layers=1,
    d_layers=1,
    factor=1,
    dropout=0.1,
    activation="relu",
    freq="M",
)
```
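Continuing from the configuration above, a minimal fit/predict sketch (my own example, with the forecast horizon matching pred_len=15) looks like this:

```python
# Minimal fit/predict sketch continuing from the model defined above;
# the dataset is an illustrative placeholder.
from sktime.datasets import load_airline

y = load_airline()
model.fit(y, fh=list(range(1, 16)))
y_pred = model.predict()
```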
For further details on how the estimator functions, please refer to CODE.ipynb.
- Title: [ENH] Interface to TimesFM Forecaster
- Status: Open
- Pull Request: #6571
- Related Issue: #6408
- Research Paper: A Decoder-Only Foundation Model for Time-Series Forecasting
- Official Code: google-research/timesfm
TimesFM (Time Series Foundation Model) is a pre-trained model developed by Google Research, designed specifically for time-series forecasting.
While integrating this model into sktime, I encountered new libraries and packages. Due to dependency conflicts with the package available on PyPI, I forked the code to sktime/libs/timesfm.
I then created an interface for the model within the TimesFMForecaster class.
Throughout this implementation, I gained hands-on experience with foundation models and explored their capabilities.
This pull request is still in progress, but once merged, you will be able to load the estimator into sktime using the following code:
```python
from sktime.forecasting.timesfm_forecaster import TimesFMForecaster

forecaster = TimesFMForecaster(
    context_len=64,
    horizon_len=32,
)
```
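As with the other forecasters, usage would follow the standard sktime fit/predict pattern; the sketch below is hypothetical until the PR is merged, and the dataset and horizon are placeholders of mine:

```python
# Hypothetical usage once the PR is merged; dataset and horizon are placeholders.
from sktime.datasets import load_airline

y = load_airline()
forecaster.fit(y, fh=[1, 2, 3, 4, 5, 6])
y_pred = forecaster.predict()
```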
For more details on how the estimator functions, please refer to CODE.ipynb.
- Title: [ENH] Extend HFTransformersForecaster for PEFT Methods
- Status: Merged
- Pull Request: #6457
- Related Issue: #6435
The HFTransformersForecaster in sktime allows users to load and fine-tune pre-trained models from Hugging Face. In this PR, I extended the HFTransformersForecaster to support Parameter-Efficient Fine-Tuning (PEFT) methods, enabling more efficient fine-tuning of large pre-trained models using customized configurations.
Through this implementation, I gained a deeper understanding of various PEFT techniques and how they can enhance the fine-tuning process for large-scale models.
You can now load the estimator in sktime with a PEFT configuration using the following code:
```python
from sktime.forecasting.hf_transformers_forecaster import HFTransformersForecaster
from peft import LoraConfig

forecaster = HFTransformersForecaster(
    model_path="huggingface/autoformer-tourism-monthly",
    fit_strategy="peft",
    training_args={
        "num_train_epochs": 20,
        "output_dir": "test_output",
        "per_device_train_batch_size": 32,
    },
    config={
        "lags_sequence": [1, 2, 3],
        "context_length": 2,
        "prediction_length": 4,
        "use_cpu": True,
        "label_length": 2,
    },
    peft_config=LoraConfig(
        r=8,
        lora_alpha=32,
        target_modules=["q_proj", "v_proj"],
        lora_dropout=0.01,
    ),
)
```
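Continuing from the snippet above, calling fit then runs the PEFT fine-tuning before forecasting; the dataset and horizon below are my own illustrative choices:

```python
# Continues from the forecaster defined above; dataset and horizon are illustrative.
from sktime.datasets import load_airline

y = load_airline()
forecaster.fit(y, fh=[1, 2, 3, 4])  # fine-tunes the LoRA adapters, then forecasts
y_pred = forecaster.predict()
```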
For more details on how the estimator works, please refer to CODE.ipynb.
In sktime, some global forecasters require the forecasting horizon to be specified during the fitting process, limiting their ability to predict on different horizons afterward. This pull request introduces the AutoregressiveWrapper, which wraps around these forecasters, allowing them to forecast on varying horizons while fitting on a fixed horizon that is generated internally.
During this implementation, I deepened my understanding of pandas indexes, particularly in handling multi-indexes. By the end of the process, I was able to create efficient and reliable code.
This PR is still in progress, but once merged, you will be able to load a forecaster and apply the AutoregressiveWrapper using the following code:
```python
import numpy as np

from sktime.forecasting.compose import AutoRegressiveWrapper
from sktime.forecasting.pytorchforecasting import PytorchForecastingNBeats

forecaster = PytorchForecastingNBeats(
    trainer_params={
        "max_epochs": 20,
    }
)

wrapper = AutoRegressiveWrapper(
    forecaster=forecaster,
    horizon_length=5,
    aggregate_method=np.mean,
)
```
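For illustration (this sketch is my own, and the API may still change while the PR is open), the wrapped forecaster can then be fitted without committing to a horizon and queried on an arbitrary one:

```python
# Illustrative sketch; the wrapper's API may change before the PR is merged.
from sktime.datasets import load_airline

y = load_airline()
wrapper.fit(y)  # fits internally on the fixed horizon_length
y_pred = wrapper.predict(fh=list(range(1, 13)))  # rolls forward autoregressively
```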
For more details on how the estimator works, please refer to CODE.ipynb.
- Title: [ENH] Global Forecast API for BaseDeepNetworkPyTorch-Based Interfaces
- Status: Draft
- Pull Request: #6928
- Related Issue: #6836
This PR enhances the BaseDeepNetworkPyTorch class to support global forecasting, enabling models like CINNForecaster and the LTSF family to operate as global forecasters.
Although still a work in progress, once merged these models will be trainable on hierarchical data, like other global forecasters in the sktime framework.
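To make the target behaviour concrete, the sketch below shows the global forecasting pattern these models would follow once the PR is merged: fit once on a panel of series, then forecast for series supplied at predict time. The toy panel and hyperparameters are my own illustrative choices, not taken from the PR.

```python
# Hypothetical sketch of the global forecasting pattern targeted by this PR;
# the toy panel and hyperparameters are illustrative only.
import numpy as np
import pandas as pd
from sktime.forecasting.ltsf import LTSFLinearForecaster

# toy panel: three series of 48 monthly observations, indexed (series, time)
rng = np.random.default_rng(0)
idx = pd.MultiIndex.from_product(
    [["A", "B", "C"], pd.period_range("2015-01", periods=48, freq="M")],
    names=["series", "time"],
)
y_panel = pd.DataFrame({"value": rng.normal(size=len(idx))}, index=idx)

forecaster = LTSFLinearForecaster(seq_len=12, pred_len=6, num_epochs=5)
forecaster.fit(y_panel, fh=list(range(1, 7)))  # fit once across the whole panel
y_pred = forecaster.predict(y=y_panel)         # global API: forecast series given at predict time
```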
Some relevant future work:
- There is a list of foundation models expected to be integrated into sktime - #6177
- Some estimators need to be extended to support the global forecasting interface - #6836
- Enhancements are planned around the PyTorch adapter for forecasting - #6641
- Further improvements are planned for the global forecasting interface - #6997
- Enabling PEFT methods for foundation models - #6968
I had a great experience over the summer, and although the GSoC period is coming to an end, I will continue contributing to sktime. I'm incredibly thankful to both Google and sktime for giving me this opportunity, and to the welcoming community and amazing mentors at sktime for making this experience such a memorable one. There is no doubt that I am a better coder than I was four months ago, and I'm eagerly looking forward to learning more in the time to come.