Add docs for offline loading (#36)
* Update README.md

* Update index.md

* Update README.md

* Update index.md
Xing Han Lu authored Apr 10, 2021
1 parent a6f7410 commit 45f8d86
Showing 2 changed files with 65 additions and 0 deletions.
33 changes: 33 additions & 0 deletions README.md
@@ -113,6 +113,39 @@ print(dlt.utils.available_codes('mbart50')) # Code corresponding to each langua
print(dlt.utils.get_lang_code_map('mbart50')) # Dictionary of lang -> code
```

### Offline usage

Unlike the Google Translate or Microsoft Translator APIs, this library can be used fully offline. However, you will first need to download the packages and models on a machine with internet access, then move them to your offline environment, where they can be installed and loaded inside a virtual environment.

First, run the following in your terminal:
```bash
mkdir dlt
cd dlt
mkdir libraries
# Download dl-translate and all of its dependencies into libraries/
pip download -d libraries/ dl-translate
```

Once all the required packages are downloaded, you will need to use Hugging Face Hub to download the model files. Install it with `pip install huggingface-hub`. Then, run the following inside Python:
```python
import os
import huggingface_hub as hub

# Download every file in the model repository to the local cache,
# then move the snapshot folder to an easy-to-find name in the current directory
dirname = hub.snapshot_download("facebook/m2m100_418M")
os.rename(dirname, "cached_model_m2m100")
```
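
If you prefer the mBART-50 family instead, the same pattern should work. The sketch below is hedged: it assumes the mBART-50 checkpoint used by this library is `facebook/mbart-large-50-many-to-many-mmt`.

```python
import os
import huggingface_hub as hub

# Cache the mBART-50 checkpoint the same way (the repository name is an assumption)
dirname = hub.snapshot_download("facebook/mbart-large-50-many-to-many-mmt")
os.rename(dirname, "cached_model_mbart50")
```

You would then load it offline with `dlt.TranslationModel("cached_model_mbart50", model_family="mbart50")`.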

Now, move everything in the `dlt` directory to your offline environment. There, create a virtual environment and run the following in your terminal:
```bash
# Install from the locally downloaded packages only, without contacting PyPI
pip install --no-index --find-links libraries/ dl-translate
```

Now, run inside Python:
```python
import dl_translate as dlt

# Load from the local folder; model_family tells dlt which architecture the saved files use
mt = dlt.TranslationModel("cached_model_m2m100", model_family="m2m100")
```
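
To check that the offline setup works end to end, you can run a quick translation with the loaded model. The snippet below is a minimal sketch; it assumes the library's `translate(text, source=..., target=...)` call and the `dlt.lang` constants, and the sentence and language pair are only illustrations.

```python
import dl_translate as dlt

# Reload the locally cached model (no network access needed)
mt = dlt.TranslationModel("cached_model_m2m100", model_family="m2m100")

# Translate a single sentence as a quick smoke test
text = "The model was downloaded ahead of time, so this runs fully offline."
print(mt.translate(text, source=dlt.lang.ENGLISH, target=dlt.lang.FRENCH))
```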


## Advanced

32 changes: 32 additions & 0 deletions docs/index.md
@@ -144,6 +144,38 @@ At the moment, the following models are accepted:
- `"mbart50"`
- `"m2m100"`

### Offline usage

Unlike the Google Translate or Microsoft Translator APIs, this library can be used fully offline. However, you will first need to download the packages and models on a machine with internet access, then move them to your offline environment, where they can be installed and loaded inside a virtual environment.

First, run the following in your terminal:
```bash
mkdir dlt
cd dlt
mkdir libraries
# Download dl-translate and all of its dependencies into libraries/
pip download -d libraries/ dl-translate
```

Once all the required packages are downloaded, you will need to use Hugging Face Hub to download the model files. Install it with `pip install huggingface-hub`. Then, run the following inside Python:
```python
import os
import huggingface_hub as hub

# Download every file in the model repository to the local cache,
# then move the snapshot folder to an easy-to-find name in the current directory
dirname = hub.snapshot_download("facebook/m2m100_418M")
os.rename(dirname, "cached_model_m2m100")
```

Now, move everything in the `dlt` directory to your offline environment. There, create a virtual environment and run the following in your terminal:
```bash
# Install from the locally downloaded packages only, without contacting PyPI
pip install --no-index --find-links libraries/ dl-translate
```

Now, run inside Python:
```python
import dl_translate as dlt

# Load from the local folder; model_family tells dlt which architecture the saved files use
mt = dlt.TranslationModel("cached_model_m2m100", model_family="m2m100")
```
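
As a quick offline smoke test, the sketch below translates a small batch of sentences with the loaded model. It assumes the library's `translate` method accepts a list of strings along with `source`, `target`, and a `batch_size` argument, and that the `dlt.lang` constants are available; treat the exact sentences and language pair as illustrative.

```python
import dl_translate as dlt

mt = dlt.TranslationModel("cached_model_m2m100", model_family="m2m100")

# Translate a small batch of sentences without any network access
sentences = [
    "The model files were downloaded ahead of time.",
    "No internet connection is needed at translation time.",
]
print(mt.translate(sentences, source=dlt.lang.ENGLISH, target=dlt.lang.SPANISH, batch_size=2))
```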

## Advanced

