The SDK supports custom OpenAI-compatible API providers (BYOK - Bring Your Own Key), including local providers like Ollama. When using a custom provider, you must specify the `Model` explicitly.
**ProviderConfig fields:**

- `Type` (string): Provider type - `"openai"`, `"azure"`, or `"anthropic"` (default: `"openai"`)
- `BaseUrl` (string): API endpoint URL (required)
- `ApiKey` (string): API key (optional for local providers like Ollama)
- `BearerToken` (string): Bearer token for authentication (takes precedence over `ApiKey`)
- `WireApi` (string): API format for OpenAI/Azure - `"completions"` or `"responses"` (default: `"completions"`)
- `Azure` (AzureProviderOptions): Azure-specific options with `ApiVersion` (default: `"2024-10-21"`)
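As a sketch, a local Ollama provider could be configured like this. Only the `ProviderConfig` field names come from the list above; the `ChatClient` type and its `model` parameter are hypothetical stand-ins for whatever entry point your SDK version exposes, and `http://localhost:11434` is an assumption based on Ollama's usual default port:

```csharp
// Sketch: local Ollama provider (BYOK). Only the ProviderConfig fields
// are documented above; the client type and model parameter below are
// illustrative placeholders, not the SDK's actual API surface.
var provider = new ProviderConfig
{
    Type = "openai",                     // Ollama exposes an OpenAI-compatible API
    BaseUrl = "http://localhost:11434",  // assumed default Ollama endpoint
    // ApiKey omitted: optional for local providers like Ollama
};

// Model is required for custom providers; omitting it throws an error.
var client = new ChatClient(provider, model: "llama3.2");  // hypothetical entry point
```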
> - When using a custom provider, the `Model` parameter is **required**. The SDK will throw an error if no model is specified.
> - For Azure OpenAI endpoints (`*.openai.azure.com`), you **must** use `Type = "azure"`, not `Type = "openai"`.
> - The `BaseUrl` should be just the host (e.g., `https://my-resource.openai.azure.com`). Do **not** include `/openai/v1` in the URL - the SDK handles path construction automatically.
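The Azure rules above combine into the following sketch. Only the `ProviderConfig` and `AzureProviderOptions` fields are documented; the resource name and environment variable are placeholders:

```csharp
// Sketch: Azure OpenAI configuration. Field names are documented above;
// the resource name and environment variable are placeholders.
var provider = new ProviderConfig
{
    Type = "azure",                                    // required for *.openai.azure.com
    BaseUrl = "https://my-resource.openai.azure.com",  // host only - no /openai/v1 suffix
    ApiKey = Environment.GetEnvironmentVariable("AZURE_OPENAI_API_KEY"),
    Azure = new AzureProviderOptions { ApiVersion = "2024-10-21" },
};
```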