Merge pull request #223 from VainF/v1.2
V1.2: Improved Index Mapping
VainF authored Jul 21, 2023
2 parents 3d7c4f8 + 12713d0 commit adecb5f
Showing 3 changed files with 19 additions and 11 deletions.
10 changes: 5 additions & 5 deletions README.md
@@ -12,14 +12,14 @@
<a href="https://pytorch.org/"><img src="https://img.shields.io/badge/PyTorch-1.8 %20%7C%201.12 %20%7C%202.0-673ab7.svg" alt="Tested PyTorch Versions"></a>
<a href="https://opensource.org/licenses/MIT"><img src="https://img.shields.io/badge/License-MIT-4caf50.svg" alt="License"></a>
<a href="https://pepy.tech/project/Torch-Pruning"><img src="https://pepy.tech/badge/Torch-Pruning?color=2196f3" alt="Downloads"></a>
<a href="https://github.com/VainF/Torch-Pruning/releases/latest"><img src="https://img.shields.io/badge/Latest%20Version-1.1.9-3f51b5.svg" alt="Latest Version"></a>
<a href="https://github.com/VainF/Torch-Pruning/releases/latest"><img src="https://img.shields.io/badge/Latest%20Version-1.2.0-3f51b5.svg" alt="Latest Version"></a>
<a href="https://colab.research.google.com/drive/1TRvELQDNj9PwM-EERWbF3IQOyxZeDepp?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
<a href="https://arxiv.org/abs/2301.12900" target="_blank"><img src="https://img.shields.io/badge/arXiv-2301.12900-009688.svg" alt="arXiv"></a>
</p>

-Torch-Pruning (TP) is a library for structural pruning that enables the following features:
+Torch-Pruning (TP) is a library for structural pruning with the following features:

* **General-purpose Pruning Toolkit:** TP enables structural pruning for a wide range of deep neural networks, including *[Large Language Models (LLMs)](https://github.com/horseee/LLM-Pruner), [Diffusion Models](https://github.com/VainF/Diff-Pruning), [Yolov7](examples/yolov7/), [yolov8](examples/yolov8/), [ViT](examples/torchvision_models/), FasterRCNN, SSD, ResNe(X)t, ConvNext, DenseNet, RegNet, DeepLab, etc*. Unlike [torch.nn.utils.prune](https://pytorch.org/tutorials/intermediate/pruning_tutorial.html), which zeroizes parameters through masking, Torch-Pruning deploys a (non-deep) graph algorithm called **DepGraph** to physically remove parameters. Currently, TP is able to prune **81/85=95.3%** of the models from Torchvision 0.13.1 (a minimal usage sketch follows this list). Try this [Colab Demo](https://colab.research.google.com/drive/1TRvELQDNj9PwM-EERWbF3IQOyxZeDepp?usp=sharing) for a quick start.
* **[Performance Benchmark](benchmarks)**: Reproduce our results from the DepGraph paper.
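
As a minimal sketch of the DepGraph-based group pruning described above (adapted from the quick-start flow this README points to; the layer choice and the channel indices `[2, 6, 9]` are purely illustrative):

```python
import torch
from torchvision.models import resnet18
import torch_pruning as tp

model = resnet18(pretrained=True)

# 1. Build the dependency graph by tracing the model with a dummy input.
DG = tp.DependencyGraph().build_dependency(
    model, example_inputs=torch.randn(1, 3, 224, 224)
)

# 2. Collect the group of layers coupled with model.conv1.
group = DG.get_pruning_group(model.conv1, tp.prune_conv_out_channels, idxs=[2, 6, 9])

# 3. Physically remove the selected channels from every layer in the group.
if DG.check_pruning_group(group):  # guard against over-pruning, e.g. removing all channels
    group.prune()

print(model.conv1)  # out_channels shrinks from 64 to 61
```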
@@ -43,7 +43,7 @@ Please do not hesitate to open a [discussion](https://github.com/VainF/Torch-Pru

### **Features:**
- [x] Structural pruning for CNNs, Transformers, Detectors, Language Models and Diffusion Models. Please refer to the [examples](examples).
-- [x] High-level pruners: [MagnitudePruner](https://arxiv.org/abs/1608.08710), [BNScalePruner](https://arxiv.org/abs/1708.06519), [GroupNormPruner](https://arxiv.org/abs/2301.12900), RandomPruner, etc.
+- [x] High-level pruners: [MagnitudePruner](https://arxiv.org/abs/1608.08710), [BNScalePruner](https://arxiv.org/abs/1708.06519), [GroupNormPruner](https://arxiv.org/abs/2301.12900), [GrowingRegPruner](https://arxiv.org/abs/2012.09243), RandomPruner, etc.
- [x] Importance Criteria: L-p Norm, Taylor, Random, BNScaling, etc. (a scoring sketch follows this list)
- [x] Dependency Graph for automatic structural pruning
- [x] Supported modules: Linear, (Transposed) Conv, Normalization, PReLU, Embedding, MultiheadAttention, nn.Parameters and [customized modules](tests/test_customized_layer.py).
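
As a hedged illustration of the importance API named in this list: the sketch below scores one pruning group with an L2 magnitude criterion. The per-channel return convention of `imp(group)` is assumed from the pruner interface; Taylor and BNScale criteria would follow the same call pattern.

```python
import torch
from torchvision.models import resnet18
import torch_pruning as tp

model = resnet18()
DG = tp.DependencyGraph().build_dependency(
    model, example_inputs=torch.randn(1, 3, 224, 224)
)

# Score every output channel of model.conv1 together with its coupled layers.
group = DG.get_pruning_group(
    model.conv1, tp.prune_conv_out_channels, idxs=list(range(model.conv1.out_channels))
)

imp = tp.importance.MagnitudeImportance(p=2)  # L2-norm criterion
scores = imp(group)  # assumed: one importance score per candidate channel
print(scores.shape)  # torch.Size([64]) for resnet18's first conv
```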
@@ -55,7 +55,7 @@ Please do not hesitate to open a [discussion](https://github.com/VainF/Torch-Pru
- [ ] A strong baseline with a bag of tricks from existing methods.
- [ ] A benchmark for [Torchvision](https://pytorch.org/vision/stable/models.html) compatibility (**81/85=95.3%**, :heavy_check_mark:) and [timm](https://github.com/huggingface/pytorch-image-models) compatibility.
- [ ] Pruning from Scratch / at Initialization.
-- [ ] More high-level pruners like [FisherPruner](https://arxiv.org/abs/2108.00708), [GrowingReg](https://arxiv.org/abs/2012.09243), etc.
+- [ ] More high-level pruners like [FisherPruner](https://arxiv.org/abs/2108.00708), etc.
- [ ] More Transformers like Vision Transformers (:heavy_check_mark:), Swin Transformers, PoolFormers.
- [ ] Block/Layer/Depth Pruning
- [ ] Pruning benchmarks for CIFAR, ImageNet and COCO.
@@ -148,7 +148,7 @@ for group in DG.get_all_groups(ignored_layers=[model.conv1], root_module_types=[

### 2. High-level Pruners

-Leveraging the DependencyGraph, we developed several high-level pruners in this repository to facilitate effortless pruning. By specifying the desired channel sparsity, the pruner will scan all prunable groups, prune the entire model, and fine-tune it using your own training code. For detailed information on this process, please refer to [this tutorial](https://github.com/VainF/Torch-Pruning/blob/master/tutorials/1%20-%20Customize%20Your%20Own%20Pruners.ipynb), which shows how to implement a [slimming](https://arxiv.org/abs/1708.06519) pruner from scratch. Additionally, a more practical example is available in [benchmarks/main.py](benchmarks/main.py).
+Leveraging the DependencyGraph, we developed several high-level pruners in this repository to facilitate effortless pruning. By specifying the desired channel sparsity, the pruner will scan all prunable groups, prune the entire model, and fine-tune it using your own training code. For detailed information on this process, please refer to [this tutorial](https://github.com/VainF/Torch-Pruning/blob/master/examples/notebook/1%20-%20Customize%20Your%20Own%20Pruners.ipynb), which shows how to implement a [slimming](https://arxiv.org/abs/1708.06519) pruner from scratch. Additionally, a more practical example is available in [benchmarks/main.py](benchmarks/main.py).

```python
import torch
# … (remainder of this snippet is hidden by the diff view)
```
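
Since the diff view truncates the snippet above, here is a hedged sketch of the high-level pruner flow the paragraph describes, assuming the v1.2 `MagnitudePruner` interface; the sparsity target and the ignored-layer rule are illustrative:

```python
import torch
from torchvision.models import resnet18
import torch_pruning as tp

model = resnet18(pretrained=True)
example_inputs = torch.randn(1, 3, 224, 224)

imp = tp.importance.MagnitudeImportance(p=2)  # importance criterion

# Keep the final 1000-way classifier intact.
ignored_layers = [
    m for m in model.modules()
    if isinstance(m, torch.nn.Linear) and m.out_features == 1000
]

pruner = tp.pruner.MagnitudePruner(
    model,
    example_inputs,
    importance=imp,
    ch_sparsity=0.5,               # target channel sparsity (v1.2 keyword)
    ignored_layers=ignored_layers,
)

pruner.step()  # scans all prunable groups and prunes the model in place

# Fine-tune the pruned model afterwards with your own training code.
macs, nparams = tp.utils.count_ops_and_params(model, example_inputs)
```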
16 changes: 12 additions & 4 deletions benchmarks/readme.md
@@ -14,8 +14,9 @@ A Prunability Benchmark is available at [benchmarks/prunability](prunability)
| HRank [[6]](#6) | 93.26 | 92.17 | -0.09 |2.00x |
| SFP [[7]](#7) | 93.59 | 93.36 | -0.23 |2.11x |
| ResRep [[8]](#8) | 93.71 | 93.71 | +0.00 |2.12x |
-| Ours-L1 | 93.53 | 92.93 | -0.60 | 2.12x |
-| Ours-BN | 93.53 | 93.29 | -0.24 | 2.12x |
+| Group-L1 | 93.53 | 92.93 | -0.60 | 2.12x |
+| Group-BN | 93.53 | 93.29 | -0.24 | 2.12x |
+| Group-GReg | 93.53 | 93.55 | +0.02 | 2.12x |
| Ours w/o SL | 93.53 | 93.46 | -0.07 | 2.11x |
| **Ours** | 93.53 | **93.77** | +0.38 | 2.13x |
||
@@ -49,20 +49,27 @@ python main.py --mode pretrain --dataset cifar10 --model resnet56 --lr 0.1 --tot

### 1.2 CIFAR-10 Pruning

-#### - L1-Norm Pruner
+#### - L1-Norm Pruner (Group-L1)
+A group-level pruner adapted from [Pruning Filters for Efficient ConvNets](https://arxiv.org/abs/1608.08710)
```bash
# 2.11x
python main.py --mode prune --model resnet56 --batch-size 128 --restore </path/to/pretrained/model> --dataset cifar10 --method l1 --speed-up 2.11 --global-pruning
```

-#### - BN Pruner
+#### - BN Pruner (Group-BN)
+A group-level pruner adapted from [Learning Efficient Convolutional Networks through Network Slimming](https://arxiv.org/abs/1708.06519)
```bash
# 2.11x
python main.py --mode prune --model resnet56 --batch-size 128 --restore </path/to/pretrained/model> --dataset cifar10 --method slim --speed-up 2.11 --global-pruning --reg 1e-5
```

+#### - Growing Regularization (Group-GReg)
+A group-level pruner adapted from [Neural Pruning via Growing Regularization](https://arxiv.org/abs/2012.09243)
+```bash
+# 2.11x
+python main.py --mode prune --model resnet56 --batch-size 128 --restore </path/to/pretrained/model> --dataset cifar10 --method growing_reg --speed-up 2.11 --global-pruning --reg 1e-4 --delta_reg 1e-5
+```
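
For orientation, the new `--reg`/`--delta_reg` flags correspond to the initial regularization strength and its growth step. Below is a hedged Python sketch of the sparse-learning loop behind this command: `GrowingRegPruner`'s keyword names and the `regularize`/`update_reg` helpers are assumed from the benchmark script, the criterion and schedule are illustrative, and a toy loader stands in for CIFAR-10.

```python
import torch
import torch.nn.functional as F
from torchvision.models import resnet18
import torch_pruning as tp

model = resnet18(num_classes=10)
example_inputs = torch.randn(8, 3, 224, 224)
imp = tp.importance.MagnitudeImportance(p=2)  # criterion choice is illustrative

pruner = tp.pruner.GrowingRegPruner(
    model, example_inputs, importance=imp,
    ch_sparsity=0.5,
    reg=1e-4,        # initial regularization strength (--reg)
    delta_reg=1e-5,  # per-update growth step (--delta_reg)
)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loader = [(torch.randn(8, 3, 224, 224), torch.randint(0, 10, (8,)))
          for _ in range(4)]  # toy stand-in for the CIFAR-10 loader

for epoch in range(2):
    for x, y in loader:
        optimizer.zero_grad()
        F.cross_entropy(model(x), y).backward()
        pruner.regularize(model)  # assumed helper: adds sparsity gradients
        optimizer.step()
    pruner.update_reg()  # assumed helper: grows the penalty on low-importance groups

pruner.step()  # finally remove the regularized, low-importance channels
```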

#### - Group Pruner (This Work)
```bash
# 2.11x without sparse learning (Ours w/o SL)
# … (remainder of this snippet is hidden by the diff view)
```
4 changes: 2 additions & 2 deletions setup.py
@@ -5,10 +5,10 @@

setuptools.setup(
    name="torch-pruning",
-    version="v1.1.9",
+    version="v1.2.0",
    author="Gongfan Fang",
    author_email="gongfan@u.nus.edu",
-    description="Structural Pruning for Model Acceleration.",
+    description="Towards Any Structural Pruning",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/VainF/Torch-Pruning",
