
Commit 36b7ac1

Add auto-generated docs.

1 parent 123e990

File tree

8 files changed: +917 -24 lines


bin/generate_truss_examples.py

Lines changed: 25 additions & 16 deletions
```diff
@@ -17,7 +17,7 @@
 import subprocess
 import sys
 from pathlib import Path
-from typing import List, Optional, Tuple
+from typing import Iterator, List, Optional, Tuple
 
 import yaml
 
@@ -235,10 +235,26 @@ def _format_group_name(group_name: str) -> str:
     into a more human readable format for the table of contents.
 
     Note that parent directory names are assumed to be in the format:
-    * 1_introduction/...
-    * 2_image_classification/...
+    * 1_introduction/... (becomes "Introduction")
+    * 2_image_classification/... (becomes "Image classification")
+    * 3_llms/... (becomes "LLMs")
     """
-    return " ".join(group_name.split("_")[1:]).capitalize()
+    lowercase_name = " ".join(group_name.split("_")[1:])
+    # Capitalize the first letter. We do this rather than
+    # use .capitalize() or .title() because we want to preserve
+    # the case of subsequent letters
+    return lowercase_name[0].upper() + lowercase_name[1:]
+
+
+def _toc_section(
+    example_group_name: str, example_group: Iterator[Tuple[str, ...]]
+) -> dict:
+    return {
+        "group": _format_group_name(example_group_name),
+        "pages": [
+            f"examples/{example[0]}/{example[1]}" for example in list(example_group)
+        ],
+    }
 
 
 def update_toc(example_dirs: List[str]):
@@ -269,18 +285,11 @@ def update_toc(example_dirs: List[str]):
         key=lambda example: example[0],
     )
 
-    # TODO: Chage this to instead of appending to pages, to instead replace the pages
-    # before we productionize this.
-    for example_group_name, example_group in grouped_examples:
-        examples_section["pages"].append(
-            {
-                "group": _format_group_name(example_group_name),
-                "pages": [
-                    f"examples/{example[0]}/{example[1]}"
-                    for example in list(example_group)
-                ],
-            }
-        )
+    examples_section["pages"] = [
+        _toc_section(example_group_name, example_group)
+        for example_group_name, example_group in grouped_examples
+    ]
+
     serialized_mint_config = json.dumps(mint_config, indent=2)
     Path(MINT_CONFIG_PATH).write_text(serialized_mint_config)
```
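
A note on the `_format_group_name` change: Python's built-in `str.capitalize()` lowercases every character after the first, which would mangle mixed-case group names. The standalone sketch below (not part of this commit; the `3_LLMs` directory name is assumed for illustration, since the new code preserves existing case) shows the difference:

```python
# Standalone sketch comparing the old and new formatting logic.
def old_format(group_name: str) -> str:
    # Old behavior: .capitalize() lowercases everything after the first letter.
    return " ".join(group_name.split("_")[1:]).capitalize()

def new_format(group_name: str) -> str:
    # New behavior: uppercase only the first letter, preserving the rest.
    name = " ".join(group_name.split("_")[1:])
    return name[0].upper() + name[1:]

print(old_format("2_image_classification"))  # Image classification
print(new_format("2_image_classification"))  # Image classification
print(old_format("3_LLMs"))                  # Llms  (case is lost)
print(new_format("3_LLMs"))                  # LLMs  (case is preserved)
```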

Lines changed: 156 additions & 0 deletions
@@ -0,0 +1,156 @@
---
title: "Getting Started: Text Classification"
description: "Building your first Truss"
---

In this example, we go through building your first Truss model. We'll use the HuggingFace transformers library to build a text classification model that can detect the sentiment of text.

# Step 1: Implementing the model

Set up imports for this model. In this example, we simply use the HuggingFace transformers library.

```python model/model.py
from transformers import pipeline

```
Every Truss model must implement a `Model` class. This class must have:
* an `__init__` function
* a `load` function
* a `predict` function

In the `__init__` function, set up any variables that will be used in the `load` and `predict` functions.

```python model/model.py
class Model:
    def __init__(self, **kwargs):
        self._model = None

```
In the `load` function of the Truss, we implement logic
involved in downloading the model and loading it into memory.
For this Truss example, we define a HuggingFace pipeline, and choose
the `text-classification` task, which uses BERT for text classification under the hood.

Note that the `load` function runs once, when the model server starts up, before any predictions are served.

```python model/model.py
    def load(self):
        self._model = pipeline("text-classification")

```
In the `predict` function of the Truss, we implement logic related
to actual inference. For this example, we just call the HuggingFace pipeline
that we set up in the `load` function.

```python model/model.py
    def predict(self, model_input):
        return self._model(model_input)
```

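Before packaging anything, it can be useful to preview what this pipeline returns. The following standalone sketch (not part of the Truss itself) runs the same pipeline locally; the exact score will vary, but the output shape is the standard transformers pipeline result:

```python
# Standalone sketch: run the same pipeline outside Truss to preview its output.
from transformers import pipeline

model = pipeline("text-classification")
print(model("Truss is awesome!"))
# Example output shape: [{'label': 'POSITIVE', 'score': 0.99...}]
```
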
# Step 2: Writing the config.yaml

Each Truss has a config.yaml file where we can configure
options related to the deployment. This is where we define
requirements, resources, and runtime options like secrets
and environment variables.

### Basic Options

In this section, we can define basic metadata about the model,
such as the name and the Python version to build with.

```yaml config.yaml
model_name: bert
python_version: py310
model_metadata: {}

```
### Set up Python requirements

In this section, we define any pip requirements that
we need to run the model. To run this model, we need PyTorch
and Transformers.

```yaml config.yaml
requirements:
- torch==2.0.1
- transformers==4.33.2

```
### Configure the resources needed

In this section, we can configure the resources
needed to deploy this model. Here, we have no need for a GPU,
so we leave the accelerator section blank.

```yaml config.yaml
resources:
  accelerator: null
  cpu: '1'
  memory: 2Gi
  use_gpu: false

```
### Other config options

Truss also has provisions for adding other runtime options,
such as secrets and system packages. In this example, we don't
need these, so we leave them empty for now.

```yaml config.yaml
secrets: {}
system_packages: []
environment_variables: {}
external_package_dirs: []

```
# Step 3: Deploying & running inference

Deploy the model with the following command:

```bash
$ truss push
```

And then you can perform inference with:

```bash
$ truss predict -d '"Truss is awesome!"'
```

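If you prefer calling the deployed model over HTTP rather than through the CLI, a hypothetical `requests` sketch is below. The URL and API key are placeholders, not values from this commit; substitute the endpoint and key shown for your own deployment.

```python
# Hypothetical sketch: invoking the deployed model over HTTP.
# MODEL_URL and API_KEY are placeholders; copy the real values
# from your deployment after `truss push` completes.
import requests

MODEL_URL = "https://<your-model-endpoint>/predict"  # placeholder
API_KEY = "<your-api-key>"                           # placeholder

resp = requests.post(
    MODEL_URL,
    headers={"Authorization": f"Api-Key {API_KEY}"},
    json="Truss is awesome!",  # same input as the CLI example above
)
print(resp.json())
```
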
<RequestExample>
```python model/model.py
from transformers import pipeline

class Model:
    def __init__(self, **kwargs):
        self._model = None

    def load(self):
        self._model = pipeline("text-classification")

    def predict(self, model_input):
        return self._model(model_input)
```
```yaml config.yaml
model_name: bert
python_version: py310
model_metadata: {}

requirements:
- torch==2.0.1
- transformers==4.33.2

resources:
  accelerator: null
  cpu: '1'
  memory: 2Gi
  use_gpu: false

secrets: {}
system_packages: []
environment_variables: {}
external_package_dirs: []

```
</RequestExample>
