
Thesis

PyTorch implementation of the thesis "Efficient Hand Gesture Recognition using Multi-Task Multi-Modal Learning and Self-Distillation". (MMAsia 2023)

Overall Framework

Framework

Preparation

  1. Run construct_annot() in dataset_EgoGesture.py to generate the EgoGesture annotations.
  2. Run construct_annot() in dataset_NvGesture.py to generate the NvGesture annotations (a minimal Python sketch of both calls is shown below).
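
If you would rather build both sets of annotations from a single script, here is a minimal sketch. The import paths come from the two files above, but the exact `construct_annot()` signatures are an assumption, so check the function definitions before running:

```python
# Minimal sketch: build the annotation files for both datasets in one go.
# NOTE: the exact construct_annot() signatures are an assumption; check both files first.
from dataset_EgoGesture import construct_annot as build_ego_annot
from dataset_NvGesture import construct_annot as build_nv_annot

if __name__ == "__main__":
    build_ego_annot()   # writes the EgoGesture annotations
    build_nv_annot()    # writes the NvGesture annotations
```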

Training

Stage 1: Multi-Task Multi-Modal Learning (MTMM)

MTMM

See sh/train_ego.sh for more details.

python3 train_mtmm.py --notes MTMM ...
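
To give a sense of what a multi-task multi-modal objective computes, here is a rough, generic sketch of such a loss. It is not the repository's actual training loop; the model interface, modality names, and auxiliary task below are all hypothetical (see train_mtmm.py for the real implementation):

```python
import torch.nn.functional as F

def mtmm_loss(model, rgb, depth, gesture_labels, aux_labels, w_aux=0.5):
    """Hypothetical multi-task multi-modal objective: a shared backbone takes
    two modalities (RGB + depth) and feeds two task heads."""
    gesture_logits, aux_logits = model(rgb, depth)                  # forward pass over both modalities
    loss_gesture = F.cross_entropy(gesture_logits, gesture_labels)  # main gesture-recognition task
    loss_aux = F.cross_entropy(aux_logits, aux_labels)              # auxiliary task head
    return loss_gesture + w_aux * loss_aux                          # weighted multi-task sum
```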

Stage 2: Self-Distillation (SD)

SD

Stage 2 trains with self-distillation, starting from the MTMM weights obtained in Stage 1. See sh/train_ego.sh for more details.

python3 train_sd.py --notes SD ...
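
Since this stage initializes from the Stage 1 checkpoint, below is a minimal sketch of loading MTMM weights before self-distillation training. The model and checkpoint path are placeholders; train_sd.py is the authoritative reference for how the weights are actually restored:

```python
import torch
import torch.nn as nn

# Placeholder model: substitute the actual Stage 2 network from this repository.
model = nn.Sequential(nn.Flatten(), nn.Linear(512, 83))      # hypothetical stand-in

# Load the Stage 1 (MTMM) checkpoint; the path is a placeholder.
state_dict = torch.load("checkpoints/mtmm_best.pth", map_location="cpu")
missing, unexpected = model.load_state_dict(state_dict, strict=False)  # tolerate head mismatches
print("missing keys:", missing)
print("unexpected keys:", unexpected)
```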

Testing

See sh/test_ego.sh for more details.

python3 test.py --note 0.01G_Predict_Next_Segment ...

Citation

If you find this code useful, please consider citing our paper:

@inproceedings{li2023efficient,
	title={Efficient Hand Gesture Recognition using Multi-Task Multi-Modal Learning and Self-Distillation},
	author={Li, Jie-Ying and Prawiro, Herman and Chiang, Chia-Chen and Chang, Hsin-Yu and Pan, Tse-Yu and Huang, Chih-Tsun and Hu, Min-Chun},
	booktitle={ACM Multimedia Asia 2023},
	pages={1--7},
	year={2023}
}
