This is a Flask-based proxy server that forwards requests to the OpenAI API through a SOCKS5 proxy. It supports both regular and streaming requests and preserves request headers and authorization tokens.
- Proxies all OpenAI API requests under `/v1/*` paths.
- Supports GET and POST methods (including streaming via Server-Sent Events).
- Forwards client Authorization and other headers.
- Routes traffic through a configurable SOCKS5 proxy (with optional authentication).
- Adds request IDs and client IP info to logs for tracing.
- Simple setup using environment variables.
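The features above can be condensed into a minimal sketch of such a proxy. The names here are illustrative (the real `openai_proxy.py` may differ), `.env` loading via python-dotenv is omitted for brevity, and the SOCKS5 settings are read from the environment variables documented below.

```python
import os

import requests
from flask import Flask, Response, request

OPENAI_BASE_URL = os.getenv("OPENAI_BASE_URL", "https://api.openai.com/v1")


def build_proxies() -> dict:
    """Build a requests-style proxies dict from the SOCKS5 env settings."""
    host = os.getenv("SOCKS5_PROXY_HOST", "localhost")
    port = os.getenv("SOCKS5_PROXY_PORT", "1080")
    user = os.getenv("SOCKS5_PROXY_USERNAME")
    password = os.getenv("SOCKS5_PROXY_PASSWORD")
    auth = f"{user}:{password}@" if user and password else ""
    # socks5h:// also resolves DNS through the proxy
    url = f"socks5h://{auth}{host}:{port}"
    return {"http": url, "https": url}


app = Flask(__name__)


@app.route("/v1/<path:path>", methods=["GET", "POST"])
def proxy(path):
    # Forward the client's headers (including Authorization), minus Host
    headers = {k: v for k, v in request.headers if k.lower() != "host"}
    upstream = requests.request(
        request.method,
        f"{OPENAI_BASE_URL}/{path}",
        headers=headers,
        data=request.get_data(),
        proxies=build_proxies(),
        stream=True,  # lets SSE responses pass through incrementally
    )
    return Response(
        upstream.iter_content(chunk_size=8192),
        status=upstream.status_code,
        content_type=upstream.headers.get("Content-Type"),
    )
```

The `socks5h` scheme (rather than `socks5`) makes the proxy resolve upstream hostnames as well, which avoids DNS leaks on the local network.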
- Python 3.7+
pip packages:

- Flask
- `requests` with SOCKS support (`requests[socks]`)
- python-dotenv
Install dependencies with:
```
pip install flask requests[socks] python-dotenv
```

Create a `.env` file in the project root with the following configuration:

```
# OpenAI API base URL (default is the official OpenAI API)
OPENAI_BASE_URL=https://api.openai.com/v1

# SOCKS5 proxy settings (required)
SOCKS5_PROXY_HOST=your.socks5.proxy.host
SOCKS5_PROXY_PORT=1080

# SOCKS5 proxy authentication (optional)
SOCKS5_PROXY_USERNAME=your_proxy_username
SOCKS5_PROXY_PASSWORD=your_proxy_password

# Flask server settings
FLASK_HOST=0.0.0.0
FLASK_PORT=8868

# Debug mode (set to True or False)
FLASK_DEBUG=False
```

- `SOCKS5_PROXY_USERNAME` and `SOCKS5_PROXY_PASSWORD` can be omitted if your SOCKS5 proxy does not require authentication.
- Adjust `FLASK_HOST` and `FLASK_PORT` for your deployment environment.
Run the proxy using:

```
python openai_proxy.py
```

For production deployment, it is recommended to run with a WSGI server such as Gunicorn:

```
gunicorn -w 4 -b 0.0.0.0:8868 openai_proxy:app
```

Send your OpenAI API requests to this proxy instead of directly to `api.openai.com`. For example:
Request URL:

```
http://localhost:8868/v1/chat/completions
```

Headers:

```
Authorization: Bearer YOUR_OPENAI_API_KEY
Content-Type: application/json
```
The proxy will forward the request through the configured SOCKS5 proxy and return the OpenAI API response transparently.
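For instance, such a request can be assembled and sent from Python. The model name and payload below are illustrative, and the live `requests.post` call is left commented out:

```python
import json

PROXY_URL = "http://localhost:8868/v1/chat/completions"  # this proxy, not api.openai.com


def build_request(api_key: str, prompt: str):
    """Assemble headers and JSON body for a chat completion via the proxy."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": "gpt-4o-mini",  # illustrative model name
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body


headers, body = build_request("YOUR_OPENAI_API_KEY", "Hello!")
# To actually send it (requires `requests` and a running proxy):
#   resp = requests.post(PROXY_URL, headers=headers, data=body)
#   print(resp.json())
```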
The proxy supports OpenAI streaming completions:
- When you include `"stream": true` in the request JSON body, the proxy will stream the response back to you as Server-Sent Events (SSE).
- The HTTP response will have the `Content-Type: text/event-stream` header.
- Clients should read chunks progressively to see partial generation results.
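On the client side, OpenAI-style SSE chunks can be parsed with a small helper like this (a sketch; with `requests` you would feed it `resp.iter_lines(decode_unicode=True)`):

```python
import json
from typing import Iterable, Iterator


def parse_sse_chunks(lines: Iterable[str]) -> Iterator[dict]:
    """Yield parsed JSON payloads from OpenAI-style SSE 'data:' lines."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines and comments
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":  # end-of-stream sentinel used by OpenAI
            break
        yield json.loads(payload)
```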
- Each request is logged with a unique Request ID and the client IP address.
- Logs include info-level messages for proxied requests and warnings/errors for failures.
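A per-request log line of that shape could be produced roughly like this (a hypothetical helper; the actual script's log format may differ):

```python
import logging
import uuid

logging.basicConfig(format="%(asctime)s %(levelname)s %(message)s", level=logging.INFO)
logger = logging.getLogger("openai_proxy")


def log_request(client_ip: str, path: str) -> str:
    """Log an incoming request with a short unique ID and return the ID."""
    request_id = uuid.uuid4().hex[:8]  # short random ID for tracing
    logger.info("[%s] %s -> /v1/%s", request_id, client_ip, path)
    return request_id
```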
- Ensure your SOCKS5 proxy is reachable and properly configured.
- The script requires the `requests[socks]` package and the underlying `PySocks` library.
- This proxy assumes clients provide the OpenAI `Authorization` header.
- The proxy does not currently implement rate limiting or additional authentication.
This project is provided as-is without warranty. Adapt and use at your own risk.
Feel free to open issues or contribute improvements!