
# SRRL

## Paper

Knowledge distillation via softmax regression representation learning

Jing Yang, Brais Martinez, Adrian Bulat, Georgios Tzimiropoulos

## Method
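SRRL reuses the teacher's pretrained classifier as the distillation signal: the student's penultimate feature is projected into the teacher's feature space and optimized so that it both matches the teacher feature directly and yields the same predictions when passed through the frozen teacher classifier (the "softmax regression"). The following is a minimal PyTorch sketch of these two loss terms; the `SRRLLoss` class, the linear `connector`, and the exact loss forms (MSE and KL divergence) are illustrative assumptions, not this repository's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SRRLLoss(nn.Module):
    """Sketch of the two SRRL losses: feature matching and softmax regression."""

    def __init__(self, s_dim: int, t_dim: int):
        super().__init__()
        # Connector mapping student features into the teacher's feature space
        # (a plain linear layer here, for illustration).
        self.connector = nn.Linear(s_dim, t_dim)

    def forward(self, feat_s, feat_t, teacher_fc):
        # Align the student feature with the teacher's feature dimension.
        feat_s2t = self.connector(feat_s)

        # Feature-matching term: pull the transformed student feature
        # toward the teacher's penultimate feature.
        loss_fm = F.mse_loss(feat_s2t, feat_t)

        # Softmax-regression term: run both features through the teacher
        # classifier and match the resulting class distributions.
        # teacher_fc's parameters should be frozen (requires_grad=False),
        # so gradients only flow back into the student and the connector.
        with torch.no_grad():
            logits_t = teacher_fc(feat_t)
        logits_s = teacher_fc(feat_s2t)
        loss_sr = F.kl_div(F.log_softmax(logits_s, dim=1),
                           F.softmax(logits_t, dim=1),
                           reduction="batchmean")
        return loss_fm + loss_sr
```

In training, this distillation loss would typically be added to the standard cross-entropy loss on the student's own logits.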

## Requirements

- Python >= 3.6
- PyTorch >= 1.0.1

## ImageNet Training and Testing

```bash
python train_imagenet_distillation.py --net_s resnet18S --net_t resnet34T
python train_imagenet_distillation.py --net_s MobileNet --net_t resnet50T
```
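In these commands, `--net_s` selects the student network and `--net_t` the pretrained teacher: a ResNet-34 teacher distilling into a ResNet-18 student, and a ResNet-50 teacher into a MobileNet student.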

## Log

https://drive.google.com/drive/folders/19OnwUad63-ITXL2TxguRdyP0KtKJIfgI

## Citation

```bibtex
@inproceedings{yang2021knowledge,
  title={Knowledge distillation via softmax regression representation learning},
  author={Yang, Jing and Martinez, Brais and Bulat, Adrian and Tzimiropoulos, Georgios},
  booktitle={International Conference on Learning Representations},
  year={2021}
}

@article{yang2020knowledge,
  title={Knowledge distillation via adaptive instance normalization},
  author={Yang, Jing and Martinez, Brais and Bulat, Adrian and Tzimiropoulos, Georgios},
  journal={arXiv preprint arXiv:2003.04289},
  year={2020}
}
```

## License

This project is licensed under the MIT License.