Cross-lingual Language Model (XLM) pretraining and Model-Agnostic Meta-Learning (MAML) for fast adaptation of deep networks
Updated Mar 26, 2021 · Jupyter Notebook
An XLM implementation with utilities to process and train on large multilingual datasets, even on machines with limited RAM.
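Since this topic page is about BPE codes (the learned merge rules used by byte-pair-encoding tokenizers such as XLM's), here is a minimal, illustrative sketch of applying a list of merge rules to a word. The `apply_bpe` function is a hypothetical helper written for this page, not part of the XLM repository's API:

```python
def apply_bpe(word, merges):
    """Greedily apply BPE merge rules to a word.

    word:   a string, treated as a sequence of characters.
    merges: a list of symbol pairs in priority order (earliest = highest
            priority), as produced by BPE training.
    Returns the word as a list of subword symbols.
    """
    # Map each merge pair to its priority rank (lower = merge first).
    ranks = {pair: i for i, pair in enumerate(merges)}
    symbols = list(word)
    while len(symbols) > 1:
        # Find the adjacent pair with the best (lowest) merge rank.
        pairs = [(ranks.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        rank, i = min(pairs)
        if rank == float("inf"):
            break  # no applicable merge rule remains
        # Merge the chosen adjacent pair into a single symbol.
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

# Example: with merges ("l","o") then ("lo","w"), "low" collapses to one token.
print(apply_bpe("low", [("l", "o"), ("lo", "w")]))   # → ['low']
print(apply_bpe("new", [("l", "o"), ("lo", "w")]))   # → ['n', 'e', 'w']
```

Real implementations (e.g. `subword-nmt` or `fastBPE`, which XLM uses) also handle end-of-word markers and caching, but the merge loop above is the core idea.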