🧠 Singular values-driven automated filter pruning
Make Structured Pruning Methods Smooth and Adaptive: Decay Pruning Method (DPM) is a novel smooth and dynamic pruning approach that can be seamlessly integrated with various existing structured pruning methods, providing significant improvements.
[CVPR 2023] DepGraph: Towards Any Structural Pruning
Robustness-Reinforced Knowledge Distillation with Correlation Distance and Network Pruning, IEEE Transactions on Knowledge and Data Engineering 2024
Collection of recent methods on (deep) neural network compression and acceleration.
A simple and effective LLM pruning approach.
[TPAMI 2023, NeurIPS 2020] Code release for "Deep Multimodal Fusion by Channel Exchanging"
[NeurIPS 2023] Structural Pruning for Diffusion Models
💍 Efficient tensor decomposition-based filter pruning
[TPAMI 2024] This is the official repository for our paper: "Pruning Self-attentions into Convolutional Layers in Single Path".
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
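For orientation, the lottery ticket procedure from the paper above is: train, prune the smallest-magnitude weights globally, rewind the survivors to their initial values, and repeat. A minimal sketch of one such round, not taken from the repository's code; `model`, `train_fn`, and `prune_ratio` are assumed placeholders, and `train_fn` is assumed to re-apply the masks after every optimizer step:

```python
import copy
import torch

def lottery_ticket_round(model, train_fn, prune_ratio=0.2, masks=None):
    """One iterative-magnitude-pruning round: train, prune, rewind.

    model     -- any torch.nn.Module (placeholder)
    train_fn  -- callable that trains the model in place and keeps
                 masked weights at zero after each step (assumed)
    masks     -- dict of 0/1 tensors from previous rounds, or None
    """
    # Remember the initialization so surviving weights can be rewound.
    init_state = copy.deepcopy(model.state_dict())

    # Train with the current masks applied.
    train_fn(model)

    # Prunable tensors: conv / linear weight matrices.
    prunable = {n: p for n, p in model.named_parameters()
                if p.dim() > 1 and "weight" in n}
    if masks is None:
        masks = {n: torch.ones_like(p) for n, p in prunable.items()}

    # Global magnitude threshold over weights that are still alive.
    alive = torch.cat([p[masks[n].bool()].abs().flatten()
                       for n, p in prunable.items()])
    threshold = torch.quantile(alive, prune_ratio)

    # Prune the smallest surviving weights and rewind the rest to init.
    with torch.no_grad():
        for n, p in prunable.items():
            masks[n] = masks[n] * (p.abs() > threshold).float()
            p.copy_(init_state[n] * masks[n])
    return masks
```

Repeating this round (e.g., removing 20% of the remaining weights each time) produces the progressively sparser subnetworks, the "winning tickets", studied in the paper.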
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)
This repository uses sparse training, group channel pruning, and knowledge distillation for YOLOv4.
[T-PAMI'23] PAGCP for the compression of YOLOv5
[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
Counting currency from video using RepNet as a base model.
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Code for the project "SNIP: Single-Shot Network Pruning"
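As background for the SNIP project above, the published method scores each connection before training by the sensitivity of the loss to removing it, |w · ∂L/∂w| on a single mini-batch, and keeps only the top fraction. A rough sketch under those assumptions (the `model`, `inputs`, `targets`, and `keep_ratio` names are placeholders, not the project's actual API):

```python
import torch
import torch.nn.functional as F

def snip_masks(model, inputs, targets, keep_ratio=0.1):
    """Score connections on one mini-batch and keep the top fraction.

    Sensitivity of weight w is |w * dL/dw|, following the SNIP paper;
    arguments are placeholders, not this repository's interface.
    """
    model.zero_grad()
    loss = F.cross_entropy(model(inputs), targets)
    loss.backward()

    # Prunable tensors: conv / linear weights that received a gradient.
    prunable = {n: p for n, p in model.named_parameters()
                if p.dim() > 1 and p.grad is not None}

    # Connection sensitivity, normalized across the whole network.
    scores = {n: (p.grad * p).abs() for n, p in prunable.items()}
    total = sum(s.sum() for s in scores.values())
    flat = torch.cat([s.flatten() for s in scores.values()]) / total

    # Global threshold keeping the top keep_ratio fraction of connections.
    k = max(1, int(keep_ratio * flat.numel()))
    threshold = torch.topk(flat, k).values.min()

    # Binary masks, applied once and held fixed during training.
    return {n: (s / total >= threshold).float() for n, s in scores.items()}
```

The returned masks are applied a single time at initialization; training then proceeds only on the remaining connections, which is what makes the method "single-shot".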