diff --git a/docs/docs/reference/Model Providers/ollama.md b/docs/docs/reference/Model Providers/ollama.md
index f2018e68b0..664f77a8ff 100644
--- a/docs/docs/reference/Model Providers/ollama.md
+++ b/docs/docs/reference/Model Providers/ollama.md
@@ -9,12 +9,15 @@
       "title": "Ollama",
       "provider": "ollama",
       "model": "llama2-7b",
-      "completionOptions": {}
+      "completionOptions": {},
+      "apiBase": "http://localhost:11434"
     }
   ]
 }
 ```
 
+If you'd like to host Ollama on another machine, you can set it up as described in the [Ollama FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network), and then set `"apiBase"` to match the IP address / port of that machine.
+
 ## Completion Options
 
 In addition to the model type, you can also configure some of the parameters that Ollama uses to run the model.