Uncovering the Hidden Cost of Model Compression
Diganta Misra*,1,2,3, Agam Goyal*,4, Bharat Runwal*,1,2, Pin-Yu Chen5
1 Mila - Quebec AI Institute,2 Landskape AI,3 UdeM,4 UW-Madison,5 IBM Research
* Equal Contribution
In the era of resource-intensive foundation models, efficient adaptation to downstream tasks has become paramount. Visual Prompting (VP), inspired by prompting in Large Language Models (LLMs), has emerged as a key transfer learning method in computer vision. Aligned with the growing significance of efficiency, research in model compression has become pivotal in alleviating the computational burden of both training and deploying over-parameterized neural networks. A key goal in model compression is the development of sparse models capable of matching or surpassing the performance of their over-parameterized, dense counterparts. While prior research has explored the impact of model sparsity on transfer learning, its effects on visual prompting-based transfer remain unclear. This study addresses this gap, revealing that model sparsity adversely affects the performance of visual prompting-based transfer, particularly in low-data-volume scenarios. Furthermore, our findings highlight the negative influence of sparsity on the calibration of downstream visual-prompted models. This empirical exploration calls for a nuanced understanding beyond accuracy in sparse settings, opening avenues for further research in Visual Prompting for sparse models.
Run pip3 install -r requirements.txt to install the dependencies.
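Optionally, the dependencies can be installed inside a virtual environment to keep them isolated (standard Python tooling, not specific to this repository):

python3 -m venv .venv
source .venv/bin/activate
pip3 install -r requirements.txt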
To run the Linear Probing script to transfer the dense model onto the full CIFAR-10 dataset with default parameters, use the following command:
python3 experiments/cnn/linear_probing.py \
    --network dense \
    --n_shot -1 \
    --batch_size 128 \
    --dataset cifar10 \
    --results_path results
Note that n_shot = -1 indicates that the entire dataset is used. To use a different N-shot data budget, provide a custom value, as shown in the example below.
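For example, to run the same transfer in a 10-shot setting (the value 10 is purely illustrative), pass the desired budget to the n_shot flag:

python3 experiments/cnn/linear_probing.py \
    --network dense \
    --n_shot 10 \
    --batch_size 128 \
    --dataset cifar10 \
    --results_path results

The same flag is accepted by the ILM-VP script described next.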
Similarly, to run the ILM-VP script to transfer the dense model onto the full CIFAR-10 dataset with default parameters, use the following command:
python3 experiments/cnn/ilm_vp.py \
    --network dense \
    --n_shot -1 \
    --batch_size 128 \
    --dataset cifar10 \
    --results_path results
To run the Linear Probing script to transfer a lottery ticket at sparsity state 8 onto the full CIFAR-10 dataset with default parameters, use the following command:
python3 experiments/cnn/linear_probing.py \
    --network LT \
    --sparsity 8 \
    --pretrained_dir pretrained_dir_name \
    --n_shot -1 \
    --batch_size 128 \
    --dataset cifar10 \
    --results_path results
As before, n_shot = -1 indicates that the entire dataset is used, and a custom value can be provided for other N-shot data budgets. The --pretrained_dir flag should point to the directory containing the pretrained lottery ticket checkpoints.
Similarly, to run the ILM-VP script to transfer a lottery ticket at sparsity state 8 onto the full CIFAR-10 dataset with default parameters, use the following command:
python3 experiments/cnn/ilm_vp.py \
    --network LT \
    --sparsity 8 \
    --pretrained_dir pretrained_dir_name \
    --n_shot -1 \
    --batch_size 128 \
    --dataset cifar10 \
    --results_path results
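To sweep multiple sparsity states in one run, the command above can be wrapped in a simple shell loop. The states listed here are illustrative and should match the checkpoints actually available:

for s in 2 4 8; do
    python3 experiments/cnn/ilm_vp.py \
        --network LT \
        --sparsity $s \
        --pretrained_dir pretrained_dir_name \
        --n_shot -1 \
        --batch_size 128 \
        --dataset cifar10 \
        --results_path results
done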
Note: The ResNet-50 lottery ticket checkpoints pretrained on ImageNet-1k used in this study may be made available upon reasonable request.
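For inspecting such a checkpoint outside the provided scripts, here is a minimal sketch, assuming the checkpoint is a standard PyTorch state dict for torchvision's ResNet-50 (the file name and dict layout below are assumptions, not the repository's actual format):

import torch
from torchvision.models import resnet50

# Instantiate the architecture used in this study (ResNet-50).
model = resnet50()

# Hypothetical file name; adjust to the actual checkpoint layout.
ckpt = torch.load("pretrained_dir_name/lt_sparsity_8.pt", map_location="cpu")

# Unwrap the weights if they are nested under a "state_dict" key.
state_dict = ckpt.get("state_dict", ckpt)
model.load_state_dict(state_dict)
model.eval()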
If you find this work useful, please consider citing:

@article{misra2023reprogramming,
  title   = {Reprogramming under constraints: Revisiting efficient and reliable transferability of lottery tickets},
  author  = {Diganta Misra and Agam Goyal and Bharat Runwal and Pin-Yu Chen},
  year    = {2023},
  journal = {arXiv preprint arXiv:2308.14969}
}