Add support for local LLMs #73
Open
Description
Users are requesting support for locally run LLMs. Currently, the setup involves:

- Changing the hosts file for DNS resolution:

  ```
  127.0.0.1 api.anthropic.com
  ```

- Creating a CA certificate.

  On Windows:

  - Download a release of https://github.com/FiloSottile/mkcert
  - Run `mkcert -install`
  - Run `mkcert localhost 127.0.0.1 api.anthropic.com`

  On WSL:

  - Run `mkdir /etc/nginx/ssl`
  - Copy the cert and key from Windows to `/etc/nginx/ssl/cert.pem` and `/etc/nginx/ssl/key.pem`

- Creating a web server:

  ```shell
  apt update && apt install nginx
  ```

- Creating the configuration file (`/etc/nginx/sites-enabled/proxy`):
  ```nginx
  server {
      listen 443 ssl;
      ssl_certificate /etc/nginx/ssl/cert.pem;
      ssl_certificate_key /etc/nginx/ssl/key.pem;

      location / {
          proxy_pass http://127.0.0.1:11434;
          proxy_set_header Host $host;
          proxy_set_header X-Real-IP $remote_addr;
          proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
          proxy_set_header X-Forwarded-Proto $scheme;

          # Long timeouts
          proxy_connect_timeout 300s;
          proxy_send_timeout 300s;
          proxy_read_timeout 300s;

          # For Ollama streaming
          proxy_buffering off;
          proxy_http_version 1.1;
          proxy_set_header Connection "";
          chunked_transfer_encoding on;
      }
  }
  ```
  ```shell
  # Start service
  service nginx start
  ```

- Creating a Modelfile on Windows:

  ```
  FROM gpt-oss:20b
  ```

- Launching with:

  ```shell
  ollama create claude-sonnet-4-20250514 -f Modelfile
  ```

Other users have inquired about local setups using LM Studio.
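For the LM Studio case mentioned above, the same nginx proxy should work with only the upstream changed. This is an untested sketch assuming LM Studio's local server is running on its default port (1234):

```nginx
# Inside the location / block of the proxy config above, point
# nginx at LM Studio instead of Ollama (1234 is LM Studio's
# default local-server port; adjust if configured differently).
proxy_pass http://127.0.0.1:1234;
```

Note that LM Studio exposes an OpenAI-style API, so forwarded requests may still need translating if the client speaks Anthropic's API rather than OpenAI's.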
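The hosts-file step above can be scripted idempotently on the WSL side. This is a sketch assuming the standard `/etc/hosts` location (the `HOSTS_FILE` override is an addition for testing) and root privileges:

```shell
# Append the api.anthropic.com override only if it is not already
# present, so re-running the script never duplicates the entry.
HOSTS_FILE="${HOSTS_FILE:-/etc/hosts}"
ENTRY="127.0.0.1 api.anthropic.com"

if ! grep -qF "api.anthropic.com" "$HOSTS_FILE"; then
    echo "$ENTRY" >> "$HOSTS_FILE"
fi

# Show the resulting entry for confirmation.
grep "api.anthropic.com" "$HOSTS_FILE"
```

Running it twice leaves a single entry, which matters because duplicate hosts lines are easy to accumulate while iterating on this setup.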