WIP
johnd0e committed Feb 21, 2024
1 parent 922bc90 commit 854c569
Showing 12 changed files with 204 additions and 0 deletions.
18 changes: 18 additions & 0 deletions .github/workflows/cf-deploy.yml
@@ -0,0 +1,18 @@
name: Deploy Worker
on:
  push:
    branches:
      - main
  pull_request:
  repository_dispatch:
jobs:
  deploy:
    runs-on: ubuntu-latest
    name: Deploy
    steps:
      - uses: actions/checkout@v4
      - name: Build & Deploy Worker
        uses: cloudflare/wrangler-action@v3
        with:
          apiToken: ${{ secrets.CF_API_TOKEN }}
          accountId: ${{ secrets.CF_ACCOUNT_ID }}
5 changes: 5 additions & 0 deletions api/about.mjs
@@ -0,0 +1,5 @@
export function GET(request) {
  return new Response(`Hello from ${process.env.VERCEL_REGION}`);
}

export { config } from "./handler.mjs";
29 changes: 29 additions & 0 deletions api/handler.mjs
@@ -0,0 +1,29 @@
import worker from "../src/worker.mjs";

export default worker.fetch;

export const config = {
  runtime: "edge",
  // Available languages and regions for Google AI Studio and Gemini API
  // https://ai.google.dev/available_regions#available_regions
  // https://vercel.com/docs/concepts/edge-network/regions
  regions: [
    //"arn1",
    "bom1",
    //"cdg1",
    "cle1",
    "cpt1",
    //"dub1",
    //"fra1",
    "gru1",
    //"hkg1",
    "hnd1",
    "iad1",
    "icn1",
    "kix1",
    "pdx1",
    "sfo1",
    "sin1",
    "syd1",
  ],
};
5 changes: 5 additions & 0 deletions api/hello.mjs
@@ -0,0 +1,5 @@
export function GET(request) {
  return new Response(`Hello from ${process.env.VERCEL_REGION}`);
}

export { config } from "./handler.mjs";
2 changes: 2 additions & 0 deletions netlify.toml
@@ -0,0 +1,2 @@
[functions]
node_bundler = "esbuild"
5 changes: 5 additions & 0 deletions netlify/edge-functions/handler.mjs
@@ -0,0 +1,5 @@
export const config = { path: "/edge/*" };

import worker from "../../src/worker.mjs";

export default worker.fetch;
5 changes: 5 additions & 0 deletions netlify/functions/handler.mjs
@@ -0,0 +1,5 @@
export const config = { path: "/*" };

import worker from "../../src/worker.mjs";

export default worker.fetch;
Empty file added public/empty.html
22 changes: 22 additions & 0 deletions public/index.html
@@ -0,0 +1,22 @@
<p>This is a demo instance of the <a href="https://github.com/PublicAffairs/openai-gemini">Gemini ➜ OpenAI</a> API-transforming proxy!

<br>
Running serverless on <em>Vercel</em>.

<p>You can try it with:

<pre>
curl ${origin}/v1/chat/completions \
  -H "Authorization: Bearer $YOUR_GEMINI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Hello"}],
    "temperature": 0.7
  }'
</pre>

<p>Please deploy your own instance, for free!
<br>
<em>This way, you can keep your API key secure.</em>

104 changes: 104 additions & 0 deletions readme.MD
@@ -0,0 +1,104 @@
## Why

The Gemini API is [free](https://ai.google.dev/pricing "limits applied!"), but there are many tools that work exclusively with the OpenAI API.

This project provides a personal OpenAI-compatible endpoint for free.

## Serverless?

Although it runs in the cloud, it does not require server maintenance.
It can be easily deployed to various providers for free (with generous limits suitable for personal use).

> [!TIP]
> Running the proxy endpoint locally is also an option, though it's more appropriate for development use.

## How to start

You will need a personal Google [API key](https://makersuite.google.com/app/apikey).

Even if you are located outside of the [supported regions](https://ai.google.dev/available_regions#available_regions) (e.g., in Europe), it is still possible to acquire one using a VPN.

Then deploy the project to one of the providers using the instructions below. You will need to set up an account there (and on GitHub).

### Deploy With Vercel

- [![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https://github.com/PublicAffairs/openai-gemini&repository-name=my-openai-gemini)
- Alternatively, it can be deployed with the [CLI](https://vercel.com/docs/cli):
  `vercel deploy`
- Serve locally: `vercel dev`
- Limitations: https://vercel.com/docs/functions/limitations

### Deploy to Netlify

- [![Deploy to Netlify](https://www.netlify.com/img/deploy/button.svg)](https://app.netlify.com/start/deploy?repository=https://github.com/PublicAffairs/openai-gemini&integrationName=integrationName&integrationSlug=integrationSlug&integrationDescription=integrationDescription)
- Alternatively, it can be deployed with the [CLI](https://docs.netlify.com/cli/get-started/):
  `netlify deploy`
- Serve locally: `netlify dev`
- Limitations:
- https://docs.netlify.com/functions/get-started/?fn-language=js#synchronous-function-2
- https://docs.netlify.com/edge-functions/limits/

### Deploy to Cloudflare

https://developers.cloudflare.com/workers/tutorials/deploy-button/

- [![Deploy to Cloudflare Workers](https://deploy.workers.cloudflare.com/button)](https://deploy.workers.cloudflare.com/?url=https://github.com/PublicAffairs/openai-gemini)
- Alternatively, it can be deployed manually by pasting the content of `src/worker.mjs` into https://workers.cloudflare.com/playground (see the `Deploy` button there).
- Alternatively, it can be deployed with the [CLI](https://developers.cloudflare.com/workers/wrangler/):
  `wrangler deploy`
- Serve locally: `wrangler dev`
- Limitations: https://developers.cloudflare.com/workers/platform/limits/#worker-limits

## How to use

> [!WARNING]
> Not all tools allow overriding the OpenAI endpoint, but many do (though the setting can sometimes be deeply hidden).
Use the endpoint address wherever you can specify it. The relevant field may be labeled "OpenAI proxy"; you might need to look under "Advanced settings" or similar sections, or in a config file.

For some command-line tools, you may need to set an environment variable, _e.g._:
`set OPENAI_BASE_URL=https://my-super-proxy.vercel.app/v1`
_...or_:
`set OPENAI_API_BASE=https://my-super-proxy.vercel.app/v1`
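
If the tool uses the official `openai` SDK, the same effect can usually be achieved in code by overriding the base URL. Below is a minimal sketch (assuming the `openai` npm package, v4+; the deployment URL is just the placeholder used above, and the Gemini key goes where an OpenAI key normally would):

```js
// Minimal sketch: point the official `openai` client at a deployed proxy instance.
// The URL below is a placeholder -- replace it with your own deployment.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://my-super-proxy.vercel.app/v1",
  apiKey: process.env.GEMINI_API_KEY, // your Gemini API key
});

const completion = await client.chat.completions.create({
  model: "gpt-3.5-turbo", // the proxy ignores the value and auto-selects a Gemini model
  messages: [{ role: "user", content: "Hello" }],
});

console.log(completion.choices[0].message.content);
```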


---

## Possible further development

- [x] `chat/completions`
  Currently, most of the parameters that make sense for both APIs are implemented, except for function calls (a hedged request sketch using them follows this list).
<details>

- [x] messages
  - [x] content
  - [x] role
    - [x] system (=>user)
    - [x] user
    - [x] assistant
    - [ ] tool (v1beta)
  - [ ] name
  - [ ] tool_calls
- [x] model _(value ignored; "gemini-pro" is auto-selected, or "gemini-pro-vision" for a "gpt-4-vision-preview" request)_
- [ ] frequency_penalty
- [ ] logit_bias
- [ ] logprobs
- [ ] top_logprobs
- [x] max_tokens
- [x] n (candidateCount <8) _n.b.: at the moment the API does not accept values >1_
- [ ] presence_penalty
- [ ] response_format
- [ ] seed
- [x] stop: string|array (stopSequences [1,5])
- [x] stream
- [x] temperature (0.0..1.0)
  - [ ] <0, >1..2
- [x] top_p
- [ ] tools (v1beta)
- [ ] tool_choice (v1beta)
- [ ] user

</details>
- [ ] `completions`
- [ ] `embeddings`
- [ ] `models`
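
The following is a hedged sketch of a `chat/completions` request exercising the parameters marked as implemented above (the deployment URL is the placeholder from the "How to use" section):

```js
// Sketch only: a chat/completions call using the currently supported parameters.
// The URL is a placeholder; replace it with your own deployment.
const response = await fetch("https://my-super-proxy.vercel.app/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.GEMINI_API_KEY}`, // Gemini key in place of an OpenAI key
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-3.5-turbo",     // value is ignored; the proxy auto-selects a Gemini model
    messages: [
      { role: "system", content: "You are a helpful assistant." }, // mapped to a user message
      { role: "user", content: "Hello" },
    ],
    max_tokens: 256,
    temperature: 0.7,           // 0.0..1.0
    top_p: 0.9,
    stop: ["\n\n"],             // string or array of up to 5 sequences
    stream: false,
  }),
});

console.log(await response.json());
```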
5 changes: 5 additions & 0 deletions vercel.json
@@ -0,0 +1,5 @@
{
  "rewrites": [
    { "source": "/(.*)", "destination": "api/handler" }
  ]
}
4 changes: 4 additions & 0 deletions wrangler.toml
@@ -0,0 +1,4 @@
name = "gemini"
main = "src/worker.mjs"
compatibility_date = "2024-01-17"
compatibility_flags = [ "nodejs_compat" ]
