CHANGELOG.md: 3 additions, 0 deletions

@@ -3,6 +3,9 @@ All notable changes to this project will be documented in this file. This change

## [Unreleased]

### Added (documentation)
- Microsoft Foundry Local BYOK provider guide in `doc/auth/byok.md`: quick start example, installation instructions, and connection troubleshooting (upstream PR #461).

### Added (upstream PR #329 sync)
- Windows console window hiding: the CLI process is spawned with explicit PIPE redirects, ensuring the JVM sets `CREATE_NO_WINDOW` on Windows so no console window appears in GUI applications. Equivalent to upstream `windowsHide: true` (upstream PR #329).

doc/auth/byok.md: 47 additions, 0 deletions

@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
| Azure OpenAI / Azure AI Foundry | `:azure` | Azure-hosted models |
| Anthropic | `:anthropic` | Claude models |
| Ollama | `:openai` | Local models via OpenAI-compatible API |
| Microsoft Foundry Local | `:openai` | Local on-device models via OpenAI-compatible API |
| Other OpenAI-compatible | `:openai` | vLLM, LiteLLM, etc. |

## Quick Start: Azure AI Foundry
@@ -49,6 +50,37 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
(println (h/query "Hello!" :session session)))
```

## Quick Start: Microsoft Foundry Local

[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. No API key is needed.

> **Note:** Foundry Local listens on a **dynamic port** that can change between restarts. Run `foundry service status` to confirm the port the service is currently listening on, then use that port in your `:base-url`.

```clojure
;; No API key needed for local Foundry Local
;; Replace <PORT> with the port from: foundry service status
(copilot/with-client-session [session
{:model "phi-4-mini"
:provider {:provider-type :openai
:base-url "http://localhost:<PORT>/v1"}}]
(println (h/query "Hello!" :session session)))
```

To get started with Foundry Local:

```bash
# Windows: Install Foundry Local CLI (requires winget)
winget install Microsoft.FoundryLocal

# macOS / Linux: see https://foundrylocal.ai for installation instructions

# Run a model (starts the local server automatically)
foundry model run phi-4-mini

# Check the port the service is running on
foundry service status
```
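
The port reported by `foundry service status` slots directly into the `:base-url` shown in the quick start. A minimal sketch of that substitution (the helper name and the port value are illustrative, not part of the Foundry CLI):

```shell
# Build the OpenAI-compatible base URL for a given Foundry Local port.
# The helper name and the example port are illustrative only; use the
# port reported by `foundry service status`.
foundry_base_url() {
  printf 'http://localhost:%s/v1\n' "$1"
}

foundry_base_url 55123
```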

## Quick Start: Anthropic

```clojure
@@ -187,6 +219,21 @@ However, if your Azure AI Foundry deployment provides an OpenAI-compatible endpo
:base-url "https://your-resource.openai.azure.com/openai/v1/"}}
```

### Connection Refused (Foundry Local)

Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:

```bash
# Check the service status and port
foundry service status
```

Update your `:base-url` to match the port shown in the output. If the service is not running, start a model to launch it:

```bash
foundry model run phi-4-mini
```
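
For example, if `foundry service status` reports that the service is listening on port 55123 (an illustrative value, not a fixed default), the session config from the quick start becomes:

```clojure
;; 55123 is an example port; substitute the one reported by
;; `foundry service status` on your machine.
{:model "phi-4-mini"
 :provider {:provider-type :openai
            :base-url "http://localhost:55123/v1"}}
```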

### Connection Refused (Ollama)

Ensure Ollama is running and accessible: