
Add support for local LLMs #73

@ninjeeter

Description


Users are requesting support for locally run LLMs. Currently, the setup involves:

  1. Changing the hosts file for DNS resolution:
127.0.0.1 api.anthropic.com
  2. Creating a CA certificate:

On Windows:

mkcert -install
mkcert localhost 127.0.0.1 api.anthropic.com

On WSL:

mkdir -p /etc/nginx/ssl
# Copy the cert and key generated on Windows to /etc/nginx/ssl/cert.pem and /etc/nginx/ssl/key.pem
  3. Installing a web server:
apt update && apt install nginx
  4. Creating the nginx configuration file (/etc/nginx/sites-enabled/proxy):
server {
    listen 443 ssl;
    ssl_certificate /etc/nginx/ssl/cert.pem;
    ssl_certificate_key /etc/nginx/ssl/key.pem;

    location / {
        proxy_pass http://127.0.0.1:11434;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;

        # Long timeouts for slow local generation
        proxy_connect_timeout 300s;
        proxy_send_timeout 300s;
        proxy_read_timeout 300s;

        # Required for Ollama streaming responses
        proxy_buffering off;
        proxy_http_version 1.1;
        proxy_set_header Connection "";
        chunked_transfer_encoding on;
    }
}

# Start service 
service nginx start
  5. Creating a Modelfile on Windows:
FROM gpt-oss:20b
  6. Creating the model under the Claude model name:
ollama create claude-sonnet-4-20250514 -f Modelfile
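With the steps above in place, the whole chain (hosts override → nginx TLS termination → Ollama) can be smoke-tested from WSL. `/api/tags` is Ollama's model-listing endpoint, so a successful response confirms routing and TLS without depending on the proxied tool:

```shell
# api.anthropic.com should now resolve to 127.0.0.1
getent hosts api.anthropic.com

# End-to-end check: TLS into nginx, proxied through to Ollama's
# /api/tags, which lists the locally available models
# (-k skips CA validation if the mkcert root is not in WSL's trust store)
curl -k https://api.anthropic.com/api/tags
```

If the second command returns JSON containing the model created in step 6, the proxy is working.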

Other users have asked about similar local setups and about using LM Studio.
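One caveat for anyone copying the setup above: nginx forwards the Anthropic-format request body to Ollama verbatim, but Ollama's native `/api/chat` expects a different schema, so a small translation layer may still be needed in between. A minimal sketch of that mapping, with field names taken from each API's public docs (the block-flattening is a simplifying assumption that ignores non-text content blocks):

```python
def anthropic_to_ollama(body: dict) -> dict:
    """Translate an Anthropic Messages API request body into an
    Ollama /api/chat request body (sketch, not a full implementation)."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # Ollama expects it as a leading message with role "system".
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for m in body["messages"]:
        content = m["content"]
        # Anthropic content may be a list of blocks; flatten text blocks
        # and drop everything else (tool_use etc.) for simplicity.
        if isinstance(content, list):
            content = "".join(b.get("text", "") for b in content)
        messages.append({"role": m["role"], "content": content})
    return {
        "model": body["model"],
        "messages": messages,
        "stream": body.get("stream", False),
        "options": {"num_predict": body.get("max_tokens", 4096)},
    }
```

LM Studio, by contrast, exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so supporting it would need a different (though similar) mapping.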
