# Class-Specific Channel Attention for Few-Shot Learning

This repository is the official implementation of *Class-Specific Channel Attention for Few-Shot Learning*.

*(Figure: training architecture)*

## Requirements

PyTorch 1.8.0 is used for the experiments in the paper.
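A minimal install sketch, assuming a pip-based environment (the repository may require additional packages, e.g. torchvision or numpy, that are not listed here):

```bash
# Install the PyTorch version used in the paper (the appropriate CUDA build may differ per system)
pip install torch==1.8.0
```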

All pretrained weights and extracted features for the 5-way 5-shot experiments in the paper can be downloaded from the PT-MAP repository.

Create the directories `./pretrained_models_features/[miniImagenet/Tiered_ImageNet/CIFAR_FS/CUB]` and place each downloaded `.plk` file in the corresponding directory, as sketched below.
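A sketch of the expected directory layout (the exact `.plk` filenames come from the PT-MAP download and are not fixed here):

```bash
# Create one directory per dataset for the downloaded feature files
mkdir -p ./pretrained_models_features/miniImagenet
mkdir -p ./pretrained_models_features/Tiered_ImageNet
mkdir -p ./pretrained_models_features/CIFAR_FS
mkdir -p ./pretrained_models_features/CUB
# Then copy the corresponding .plk file from the PT-MAP repository into each directory
```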

## Training & Testing

### 5-way 5-shot

```bash
python main.py --dataset [miniImagenet/Tiered_ImageNet/CIFAR_FS/CUB] --meta_train_epoch [10/15/20/25]
```
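For example, a single run on miniImageNet might look like the following (the pairing of dataset and `--meta_train_epoch` value is an assumption; the command above lists 10/15/20/25 as options without fixing one per dataset):

```bash
python main.py --dataset miniImagenet --meta_train_epoch 20
```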

### 5-way 1-shot

Work in progress...

## Results

| Dataset | 5-Way 1-Shot | 5-Way 5-Shot |
| --- | --- | --- |
| miniImageNet | 96.68% | 99.96% |
| Tiered-ImageNet | 96.58% | 99.37% |
| CIFAR-FS | 98.85% | 99.82% |
| CUB | 97.43% | 99.09% |

## Acknowledgment

- Channel Importance Matters in Few-Shot Image Classification
- Charting the Right Manifold: Manifold Mixup for Few-shot Learning
- Manifold Mixup: Better Representations by Interpolating Hidden States
- Leveraging the Feature Distribution in Transfer-based Few-Shot Learning