A Deno LLM API Service
Set `base_url` to `https://bunny-llm.deno.dev/v1/` or your own endpoint. Set the environment variable `BUNNY_API_TOKEN`, along with the tokens for the specific vendors you use.

The model format is `[vendor]:[model_name]`, for example `openai:gpt-4-turbo`.
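As a minimal sketch of this naming scheme (the helper below is ours, not part of the service), the vendor prefix is everything before the first `:`; the rest is the vendor's own model name, which may itself contain `:` or `/`.

```python
def split_model(model: str) -> tuple[str, str]:
    # Split on the first ':' only, since model names may themselves
    # contain ':' (e.g. 'cloudflare:@cf/qwen/qwen1.5-0.5b-chat').
    vendor, _, name = model.partition(':')
    return vendor, name

print(split_model('openai:gpt-4-turbo'))
print(split_model('cloudflare:@cf/qwen/qwen1.5-0.5b-chat'))
```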
```python
from openai import OpenAI

client = OpenAI(
    base_url='https://bunny-llm.deno.dev/v1/',
    api_key='YOUR_BUNNY_API_TOKEN',
)

model_name = 'cloudflare:@cf/qwen/qwen1.5-0.5b-chat'
res = client.chat.completions.create(
    model=model_name,
    messages=[
        {'role': 'user', 'content': 'Who are you?'},
    ],
)
print(res.choices[0].message.content)
```
- `vendor` is `free`
- `api_key` is EMPTY
- `model` is `gpt-3.5-turbo` only
- `vendor` is `cf` or `cloudflare`
- `model` can refer to Supported Models
- Set the env variables `CF_ACCOUNT_ID` and `CF_API_TOKEN`, or build the `api_key` with the following algorithm.
```javascript
// JavaScript
const api_key = encodeURIComponent(JSON.stringify({
  account: 'YOUR_CF_ACCOUNT_ID',
  token: 'YOUR_CF_API_TOKEN',
}))
```

```python
# Python
import json
import urllib.parse

api_key = urllib.parse.quote(json.dumps({
    'account': 'YOUR_CF_ACCOUNT_ID',
    'token': 'YOUR_CF_API_TOKEN',
}))
```
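Wrapped as a small helper (the function name is ours), the encoding round-trips cleanly, which is how the service can recover both Cloudflare fields from the single `api_key` string:

```python
import json
import urllib.parse

def cf_api_key(account: str, token: str) -> str:
    """Pack Cloudflare credentials into one api_key string."""
    return urllib.parse.quote(json.dumps({'account': account, 'token': token}))

key = cf_api_key('YOUR_CF_ACCOUNT_ID', 'YOUR_CF_API_TOKEN')
# Round-trip check: both fields are recoverable on the server side.
decoded = json.loads(urllib.parse.unquote(key))
print(decoded['account'], decoded['token'])
```

The resulting string is what you pass as `api_key` when constructing the client for a `cf`/`cloudflare` model.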
- `vendor` is `ds` or `dash_scope`
- `model` can refer to Supported Models
- `api_key` can be set by `DASHSCOPE_API_KEY`, or pass it directly (Get API KEY)
- `vendor` is `groq`
- `api_key` can be set by `GROQ_API_KEY`, or pass it directly (Get API KEY)
- `model` can refer to Supported Models
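Summarizing the vendor sections above in one place (the mapping itself is ours; the env-variable names are the ones this document lists):

```python
# Env variables each vendor's credentials are read from.
VENDOR_TOKEN_ENV = {
    'cloudflare': ('CF_ACCOUNT_ID', 'CF_API_TOKEN'),
    'dash_scope': ('DASHSCOPE_API_KEY',),
    'groq': ('GROQ_API_KEY',),
}
```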