This discussion was converted from issue #3127 on June 09, 2023 11:39.
I am following the instructions in ./inference/README.MD. I used them to build and then launch the inference stack. I changed the model in docker-compose.yaml like so:
```yaml
  inference-worker:
    build:
      dockerfile: docker/inference/Dockerfile.worker-full
      context: .
    image: oasst-inference-worker:dev
    environment:
      API_KEY: "0000"
      # MODEL_CONFIG_NAME: ${MODEL_CONFIG_NAME:-distilgpt2}
      MODEL_CONFIG_NAME: "OA_SFT_Pythia_12Bq"
```
When I run `python __main__.py` in the text-***** folder, I seem to get an infinite loop of "message pending" after I enter my question. What am I missing?
I have around 330 MiB on the host.
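For what it's worth, an endless "pending" status usually means the worker never picked up the job (for example, because the model failed to load in the available memory), while the client keeps polling. As an illustration only, here is a minimal poll loop with a timeout that would fail loudly instead of spinning forever; the names (`wait_for_message`, `get_status`) are hypothetical, not part of the actual Open-Assistant client:

```python
import time

def wait_for_message(get_status, timeout_s=60.0, poll_interval_s=1.0):
    """Poll get_status() until it returns something other than "pending".

    Hypothetical sketch of client-side polling with a deadline, so a
    worker that never loads its model surfaces as a TimeoutError
    rather than an infinite "message pending" loop.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status != "pending":
            return status
        time.sleep(poll_interval_s)
    raise TimeoutError(
        "worker never picked up the message; check that the model "
        "fits in available memory and that MODEL_CONFIG_NAME is valid"
    )
```

A timeout like this would at least distinguish "the worker is slow" from "the worker never started serving the model at all".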