From 6d44ac34c15bac3b0fd56304ac828d7d5e818fd9 Mon Sep 17 00:00:00 2001
From: GitHub Copilot
Date: Mon, 23 Feb 2026 14:06:33 +0000
Subject: [PATCH] docs: add Microsoft Foundry Local BYOK provider guide (upstream PR #461)

Port documentation for Microsoft Foundry Local as a BYOK provider.
Foundry Local uses an OpenAI-compatible API on a dynamic local port
and requires no API key. Adds a quick start example, installation
steps, and a troubleshooting section for dynamic-port connection
issues.

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
---
 CHANGELOG.md     |  3 +++
 doc/auth/byok.md | 47 +++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 50 insertions(+)

diff --git a/CHANGELOG.md b/CHANGELOG.md
index 31391e6..f41aa57 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -3,6 +3,9 @@ All notable changes to this project will be documented in this file. This change
 
 ## [Unreleased]
 
+### Added (documentation)
+- Microsoft Foundry Local BYOK provider guide in `doc/auth/byok.md`: quick start example, installation instructions, and connection troubleshooting (upstream PR #461).
+
 ### Added (upstream PR #329 sync)
 - Windows console window hiding: CLI process is spawned with explicit PIPE redirects ensuring the JVM sets `CREATE_NO_WINDOW` on Windows — no console window appears in GUI applications. Equivalent to upstream `windowsHide: true` (upstream PR #329).
 
diff --git a/doc/auth/byok.md b/doc/auth/byok.md
index b8d9ccf..6f68c21 100644
--- a/doc/auth/byok.md
+++ b/doc/auth/byok.md
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 | Azure OpenAI / Azure AI Foundry | `:azure` | Azure-hosted models |
 | Anthropic | `:anthropic` | Claude models |
 | Ollama | `:openai` | Local models via OpenAI-compatible API |
+| Microsoft Foundry Local | `:openai` | Local on-device models via OpenAI-compatible API |
 | Other OpenAI-compatible | `:openai` | vLLM, LiteLLM, etc. |
 
 ## Quick Start: Azure AI Foundry
@@ -49,6 +50,37 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
   (println (h/query "Hello!" :session session)))
 ```
 
+## Quick Start: Microsoft Foundry Local
+
+[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. No API key is needed.
+
+> **Note:** Foundry Local listens on a **dynamic port**; the port is not fixed across restarts. Run `foundry service status` to confirm the port the service is currently listening on, then use that port in your `:base-url`.
+
+```clojure
+;; No API key is needed for Foundry Local
+;; Replace <PORT> with the port from: foundry service status
+(copilot/with-client-session [session
+  {:model "phi-4-mini"
+   :provider {:provider-type :openai
+              :base-url "http://localhost:<PORT>/v1"}}]
+  (println (h/query "Hello!" :session session)))
+```
+
+To get started with Foundry Local:
+
+```bash
+# Windows: install the Foundry Local CLI (requires winget)
+winget install Microsoft.FoundryLocal
+
+# macOS / Linux: see https://foundrylocal.ai for installation instructions
+
+# Run a model (starts the local server automatically)
+foundry model run phi-4-mini
+
+# Check the port the service is running on
+foundry service status
+```
+
 ## Quick Start: Anthropic
 
 ```clojure
@@ -187,6 +219,21 @@ However, if your Azure AI Foundry deployment provides an OpenAI-compatible endpo
               :base-url "https://your-resource.openai.azure.com/openai/v1/"}}
 ```
 
+### Connection Refused (Foundry Local)
+
+Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:
+
+```bash
+# Check the service status and port
+foundry service status
+```
+
+Update your `:base-url` to match the port shown in the output. If the service is not running, start a model to launch it:
+
+```bash
+foundry model run phi-4-mini
+```
+
 ### Connection Refused (Ollama)
 
 Ensure Ollama is running and accessible:
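Reviewer note, outside the patch itself: the dynamic-port behavior this guide documents can be sanity-checked before wiring a port into `:base-url`, since Foundry Local's endpoint is OpenAI-compatible and should answer the standard `/v1/models` route. A minimal sketch using only the Python standard library; the port value `52009` is a hypothetical example of what `foundry service status` might report:

```python
import urllib.request
import urllib.error


def base_url(port: int) -> str:
    """Build the OpenAI-compatible base URL for a given local port."""
    return f"http://localhost:{port}/v1"


def endpoint_is_up(port: int, timeout: float = 2.0) -> bool:
    """Probe the standard OpenAI-compatible /models route on the given port."""
    try:
        with urllib.request.urlopen(f"{base_url(port)}/models", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout: nothing OpenAI-compatible on this port
        return False


# Hypothetical port taken from `foundry service status` output
print(base_url(52009))  # http://localhost:52009/v1
```

If the probe returns `False`, the port has likely changed since the last restart; re-run `foundry service status` and update `:base-url` as the troubleshooting section describes.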