5 changes: 4 additions & 1 deletion .gitignore
@@ -48,4 +48,7 @@ pids/
 
 # Temporary files
 .tmp/
-temp/
+temp/
+
+# MCP todo list
+.mcp-todos.json
37 changes: 37 additions & 0 deletions readme.md
@@ -78,6 +78,42 @@ export ANTHROPIC_API_KEY=your_key_here
# Then just run: mcp-use
```

## Using Local OpenAI-Compatible APIs

The CLI now supports local OpenAI-compatible APIs such as LM Studio, Ollama's OpenAI-compatible endpoint, and LocalAI:

### Option 1: Using the dedicated local provider

```bash
# Set your local API endpoint (defaults to http://localhost:1234/v1)
export LOCAL_OPENAI_BASE_URL=http://localhost:1234/v1

# Use any API key (local servers often don't require real keys)
export LOCAL_OPENAI_API_KEY=local-api-key

# Select the local provider and model
mcp-use
/model localopenai gpt-3.5-turbo
```

### Option 2: Using OpenAI provider with custom base URL

```bash
# Set custom base URL for OpenAI provider
export OPENAI_BASE_URL=http://localhost:1234/v1
export OPENAI_API_KEY=your-local-key

# Use as normal OpenAI provider
mcp-use
/model openai gpt-3.5-turbo
```

Common local API servers (a quick connectivity check follows this list):
- **LM Studio**: Default URL is `http://localhost:1234/v1`
- **Ollama (OpenAI mode)**: Use `http://localhost:11434/v1`
- **LocalAI**: Default URL is `http://localhost:8080/v1`
- **Text Generation WebUI**: With OpenAI extension at `http://localhost:5000/v1`
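
Before pointing the CLI at a server, it can help to confirm the endpoint actually speaks the OpenAI protocol. A minimal sketch, assuming Node 18+ for the global `fetch` and that the server implements the standard `GET /models` route (most of the servers above do):

```typescript
// Quick connectivity check against a local OpenAI-compatible server.
// LOCAL_OPENAI_BASE_URL and LOCAL_OPENAI_API_KEY match the variables above.
const baseURL =
  process.env['LOCAL_OPENAI_BASE_URL'] ?? 'http://localhost:1234/v1';

async function listLocalModels(): Promise<void> {
  const res = await fetch(`${baseURL}/models`, {
    // Local servers usually ignore the key, but send one for API parity.
    headers: {
      Authorization: `Bearer ${process.env['LOCAL_OPENAI_API_KEY'] ?? 'local-api-key'}`,
    },
  });
  if (!res.ok) {
    throw new Error(`${baseURL} answered ${res.status}; is the server running?`);
  }
  const body = (await res.json()) as {data: Array<{id: string}>};
  // Any of these ids can be passed to /model, e.g. /model localopenai <id>.
  console.log(body.data.map(m => m.id).join('\n'));
}

listLocalModels().catch(err => {
  console.error(err);
  process.exit(1);
});
```

Some servers ignore the requested model name, while others (Ollama, for example) only accept ids of locally installed models, so prefer an id returned by this endpoint over the `gpt-3.5-turbo` placeholder.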

## Usage

```
@@ -174,6 +210,7 @@ Switch LLM providers and configure settings using slash commands:
/model google gemini-1.5-pro
/model mistral mistral-large-latest
/model groq llama-3.1-70b-versatile
/model localopenai gpt-3.5-turbo # For local APIs

# List available models
/models
30 changes: 28 additions & 2 deletions source/services/llm-service.ts
@@ -28,8 +28,34 @@ const PROVIDERS = {
   openai: {
     envVar: 'OPENAI_API_KEY',
     defaultModel: 'gpt-4o',
-    factory: (key: string, cfg: LLMConfig) =>
-      new ChatOpenAI({openAIApiKey: key, modelName: cfg.model}),
+    factory: (key: string, cfg: LLMConfig) => {
+      const baseURL = process.env['OPENAI_BASE_URL'] || process.env['OPENAI_API_BASE'];

**Copilot AI** commented on Jul 11, 2025:

> [nitpick] The base URL normalization logic is duplicated between the openai and localopenai providers; consider extracting this into a shared helper to reduce duplication and improve maintainability.
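A sketch of the helper that comment suggests; the `normalizeBaseURL` name and its placement are illustrative, not part of this PR:

```typescript
// Hypothetical shared helper: strip one trailing slash so the base URL
// joins cleanly with the client's request paths.
const normalizeBaseURL = (url: string): string =>
  url.endsWith('/') ? url.slice(0, -1) : url;

// Both factories could then share the same normalization, e.g.
//   config.configuration = {baseURL: normalizeBaseURL(baseURL)};
```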
+      const config: any = {
+        openAIApiKey: key,
+        modelName: cfg.model,
+      };
+      if (baseURL) {
+        config.configuration = {
+          baseURL: baseURL.endsWith('/') ? baseURL.slice(0, -1) : baseURL,
+        };
+      }
+      return new ChatOpenAI(config);
+    },
+  },
+  localopenai: {
+    envVar: 'LOCAL_OPENAI_API_KEY',
+    defaultModel: 'gpt-3.5-turbo',
+    factory: (key: string, cfg: LLMConfig) => {
+      const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
+      const config: any = {
+        openAIApiKey: key || 'local-api-key',

**Copilot AI** commented on lines +49 to +51, Jul 11, 2025:

> [nitpick] Using a hardcoded fallback API key ('local-api-key') may lead to sending misleading credentials; consider allowing an empty key or requiring an explicit local key to avoid accidental misuse.

Suggested change:

-      const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
-      const config: any = {
-        openAIApiKey: key || 'local-api-key',
+      if (!key) {
+        throw new Error("API key for 'localopenai' provider must be explicitly provided.");
+      }
+      const baseURL = process.env['LOCAL_OPENAI_BASE_URL'] || 'http://localhost:1234/v1';
+      const config: any = {
+        openAIApiKey: key,
+        modelName: cfg.model,
+        configuration: {
+          baseURL: baseURL.endsWith('/') ? baseURL.slice(0, -1) : baseURL,
+        },
+      };
+      return new ChatOpenAI(config);
+    },
+  },
   azureopenai: {
     envVar: 'AZURE_OPENAI_API_KEY',
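
The same configuration the new factory builds can be exercised standalone. A minimal sketch, assuming the `@langchain/openai` package and a server at the default LM Studio address; the model id is a placeholder and must be one the server actually serves:

```typescript
import {ChatOpenAI} from '@langchain/openai';

async function main(): Promise<void> {
  // Mirrors what the localopenai factory produces: configuration.baseURL
  // reroutes the underlying OpenAI client to the local server.
  const llm = new ChatOpenAI({
    openAIApiKey: process.env['LOCAL_OPENAI_API_KEY'] ?? 'local-api-key',
    modelName: 'gpt-3.5-turbo',
    configuration: {baseURL: 'http://localhost:1234/v1'},
  });

  const reply = await llm.invoke('Reply with one short sentence.');
  console.log(reply.content);
}

main().catch(console.error);
```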