Showing 4 changed files with 106 additions and 2 deletions.
apps/opik-documentation/documentation/docs/production/gateway.mdx (98 additions, 0 deletions)
---
sidebar_label: LLM Gateway
---

import Tabs from "@theme/Tabs";
import TabItem from "@theme/TabItem";

# LLM Gateway

An LLM gateway is a proxy server that forwards requests to an LLM API and returns the response. It is useful when you want to centralize access to LLM providers, or when you want to query multiple LLM providers from a single endpoint using a consistent request and response format.

The Opik platform includes a lightweight LLM gateway that can be used for **development and testing purposes**. If you are looking for a production-ready LLM gateway, we recommend looking at the [Kong AI Gateway](https://docs.konghq.com/gateway/latest/ai-gateway/).

## The Opik LLM Gateway

The Opik LLM gateway is a lightweight proxy server that can be used to query different LLM APIs using the OpenAI format.

In order to use the Opik LLM gateway, you will first need to configure your LLM provider credentials in the Opik UI. Once this is done, you can use the Opik gateway to query your LLM provider:

<Tabs>
<TabItem value="Opik Cloud" title="Opik Cloud">
```bash
curl -L 'https://www.comet.com/opik/api/v1/private/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Accept: text/event-stream' \
-H 'Comet-Workspace: <OPIK_WORKSPACE>' \
-H 'authorization: <OPIK_API_KEY>' \
-d '{
    "model": "<LLM_MODEL>",
    "messages": [
        {
            "role": "user",
            "content": "What is Opik ?"
        }
    ],
    "temperature": 1,
    "stream": false,
    "max_tokens": 10000
}'
```
</TabItem>
<TabItem value="Opik self-hosted" title="Opik self-hosted">
```bash
curl -L 'http://localhost:5173/api/v1/private/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Accept: text/event-stream' \
-d '{
    "model": "<LLM_MODEL>",
    "messages": [
        {
            "role": "user",
            "content": "What is Opik ?"
        }
    ],
    "temperature": 1,
    "stream": false,
    "max_tokens": 10000
}'
```
</TabItem>
</Tabs>
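
Because the gateway follows the OpenAI chat completions format, the response body can be parsed like any other OpenAI-style response. The snippet below is a minimal sketch against the self-hosted endpoint that prints only the assistant's reply; it assumes an OpenAI-style response schema, that `jq` is installed, and that the optional parameters shown above (`temperature`, `max_tokens`) can be omitted as in the OpenAI API:

```bash
# Query the self-hosted gateway and extract the assistant reply.
# Assumes an OpenAI-style response body and that `jq` is available.
curl -sL 'http://localhost:5173/api/v1/private/chat/completions' \
-H 'Content-Type: application/json' \
-d '{
    "model": "<LLM_MODEL>",
    "messages": [
        {
            "role": "user",
            "content": "What is Opik ?"
        }
    ],
    "stream": false
}' | jq -r '.choices[0].message.content'
```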

:::warning
The Opik LLM gateway is currently in beta and is subject to change. We recommend using the Kong AI Gateway for production applications.
:::

## Kong AI Gateway

[Kong](https://docs.konghq.com/gateway/latest/) is a popular open-source API gateway that has recently released an AI Gateway.
If you are looking for an LLM gateway that is production-ready and supports many of the expected enterprise use cases
(authentication mechanisms, load balancing, caching, etc.), this is the gateway we recommend.

You can learn more about the Kong AI Gateway [here](https://docs.konghq.com/gateway/latest/ai-gateway/).

We have developed a Kong plugin that allows you to log all the LLM calls from your Kong server to the Opik platform.
The plugin is open source and available at [comet-ml/opik-kong-plugin](https://github.com/comet-ml/opik-kong-plugin).

Once the plugin is installed, you can enable it by running:

```bash
curl -is -X POST http://localhost:8001/services/{serviceName|Id}/plugins \
    --header "accept: application/json" \
    --header "Content-Type: application/json" \
    --data '
    {
        "name": "opik-log",
        "config": {
            "opik_api_key": "<Replace with your Opik API key>",
            "opik_workspace": "<Replace with your Opik workspace>"
        }
    }'
```
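
To confirm the plugin is active, you can list the plugins configured on the service through the Kong Admin API. This is a quick check, assuming the Admin API is reachable on the default port 8001; replace `{serviceName|Id}` as above:

```bash
# List the plugins enabled on the service and check that "opik-log" appears.
curl -s http://localhost:8001/services/{serviceName|Id}/plugins \
    --header "accept: application/json"
```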

You can find more information about the Opik Kong plugin in the [`opik-kong-plugin` repository](https://github.com/comet-ml/opik-kong-plugin).

Once configured, you will be able to view all your LLM calls in the Opik dashboard:

![Opik Kong Gateway](/img/production/opik-kong-gateway.png)
Binary file added (+589 KB): apps/opik-documentation/documentation/static/img/production/opik-kong-gateway.png