Unified Training of Universal Time Series Forecasting Transformers
Updated Jan 29, 2026 · Jupyter Notebook
Collection of awesome parameter-efficient fine-tuning resources.
🎉 PILOT: A Pre-trained Model-Based Continual Learning Toolbox
Model release for "Timer-XL: Long-Context Transformers for Unified Time Series Forecasting"
Small Language Model Inference, Fine-Tuning and Observability. No GPU, no labeled data needed.
[CVPR 2024] The code repository for "Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning" in PyTorch.
Vietnamese Legal Question Answering with Machine Reading Comprehension (MRC) and Answer Generation (AG) approaches. (KSE 2024)
Code for the ICML 2025 paper: "Adjustment for Confounding using Pre-Trained Representations"
Compare image similarity using features extracted from the pre-trained VGG16 model. This project leverages cosine similarity for accurate visual similarity assessment, making it ideal for image retrieval and duplicate detection.
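The cosine-similarity step that repo describes can be sketched in a few lines; this is a minimal illustration in NumPy, assuming the feature vectors have already been extracted (e.g., from a VGG16 intermediate layer). The function name and the dummy vectors are illustrative, not taken from the project.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity: dot product of the two feature vectors
    # divided by the product of their L2 norms. Ranges from -1 to 1;
    # 1.0 means the vectors point in exactly the same direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for VGG16 feature vectors of two images.
f1 = np.array([0.2, 0.8, 0.1])
f2 = np.array([0.2, 0.8, 0.1])   # identical image -> similarity 1.0
f3 = np.array([0.9, 0.1, 0.4])   # different image -> lower score

print(cosine_similarity(f1, f2))
print(cosine_similarity(f1, f3))
```

For image retrieval, the same score is computed between a query image's features and every gallery image's features, then results are ranked by descending similarity.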
We built this project during a hackathon to detect emergency vehicles in real time. Although development stopped after the third level, the system still identifies emergency vehicles reliably.
A semester project using ShuffleNet, MobileNetV3 Small & ResNet50 to classify real and fake faces on a dataset taken from Kaggle.
Streamlit app that predicts if a painting is a van Gogh
CNN-RNN image captioning system using TensorFlow/Keras with VGG16 feature extraction and LSTM decoder. Interactive Streamlit web app for real-time caption generation from uploaded images, trained on Flickr8k dataset with BLEU score evaluation.
Using deep learning models to accurately classify pet images into different breeds and types, demonstrating effective image classification and model evaluation.
This project demonstrates how to fine-tune a pre-trained ResNet18 model using PyTorch for binary classification, adapting the model to classify images into two classes: Positive and Negative.
Testing methods in PTM-enabled OSS
dis-cyril is an Alexa-like assistant using pre-trained models and buzin.
Registration of pre-trained models found on Hugging Face using blockchain technology
Fine-tuning GPT-2 for text generation | Task 1 | Prodigy Infotech Internship
Welcome to a diverse collection of hands-on ML, AI, Python, and Data Science projects. Explore and learn through practical applications. Happy coding! 🚀