An implementation of knowledge distillation using contrastive learning (CLIP loss) to transfer knowledge from a large teacher model (OpenAI's CLIP) to a smaller multilingual student model (LaBSE) on a parallel English-Persian dataset.
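A minimal sketch of the core idea (not this repository's actual code): a frozen teacher embeds the English side of each pair, the trainable student embeds the Persian side, and a symmetric CLIP-style InfoNCE loss pulls matching pairs together within each batch. The function name, batch shapes, and temperature below are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def clip_distillation_loss(teacher_emb: torch.Tensor,
                           student_emb: torch.Tensor,
                           temperature: float = 0.07) -> torch.Tensor:
    """Symmetric InfoNCE over an (N, D) batch of paired embeddings.

    Row i of teacher_emb and row i of student_emb are assumed to be
    embeddings of a parallel English-Persian sentence pair; every other
    row in the batch serves as an in-batch negative.
    """
    teacher_emb = F.normalize(teacher_emb, dim=-1)
    student_emb = F.normalize(student_emb, dim=-1)
    logits = teacher_emb @ student_emb.t() / temperature  # (N, N) similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    # Cross-entropy in both directions, as in the original CLIP objective.
    loss_t2s = F.cross_entropy(logits, targets)
    loss_s2t = F.cross_entropy(logits.t(), targets)
    return (loss_t2s + loss_s2t) / 2

# Toy usage with random tensors standing in for CLIP / LaBSE outputs.
teacher = torch.randn(8, 512)  # e.g. CLIP text embeddings (English)
student = torch.randn(8, 512)  # e.g. projected LaBSE embeddings (Persian)
print(clip_distillation_loss(teacher, student).item())
```

In practice the teacher and student output dimensions usually differ (e.g. 512 for CLIP ViT-B/32 vs 768 for LaBSE), so a learned linear projection on the student side would typically map both into the same space before computing the loss.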