Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
Awesome Knowledge Distillation
Awesome Knowledge-Distillation. A categorized collection of knowledge distillation papers (2014-2021).
A unified inference and post-training framework for accelerated video generation.
A collection of classic and cutting-edge industry papers in the fields of recommendation, advertising, and search.
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
The official repo for [NeurIPS'22] "ViTPose: Simple Vision Transformer Baselines for Human Pose Estimation" and [TPAMI'23] "ViTPose++: Vision Transformer for Generic Body Pose Estimation"
PyTorch implementation of various Knowledge Distillation (KD) methods.
A PyTorch-based knowledge distillation toolkit for natural language processing
PaddleSlim is an open-source library for deep model compression and architecture search.
mobilev2-yolov5s pruning and distillation, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
LightlyTrain is the first PyTorch framework to pretrain computer vision models on unlabeled data for industrial applications
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and dedicated similarity models.
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
⚡ Flash Diffusion ⚡: Accelerating Any Conditional Diffusion Model for Few Steps Image Generation (AAAI 2025 Oral)
Segmind Distilled Diffusion
A Python library for adversarial machine learning focusing on benchmarking adversarial robustness.
🤗 Optimum Intel: Accelerate inference with Intel optimization tools
irresponsible innovation. Try now at https://chat.dev/