
Commit

Merge pull request #21 from justinh-rahb/main
Some cleanup
tjbck authored Mar 14, 2024
2 parents 3f65f19 + 1a37da4 commit cadf46e
Showing 2 changed files with 6 additions and 6 deletions.
2 changes: 1 addition & 1 deletion docs/tutorial/images.md
@@ -13,7 +13,7 @@ Open WebUI now supports image generation through the **AUTOMATIC1111** [API](htt

## Configuring Open WebUI

-1. In Open WebUI, navigate to Settings > Images.
+1. In Open WebUI, navigate to **Settings > Images**.
2. In the API URL field, enter the address where AUTOMATIC1111's API is accessible, following this format:
```
http://<your_automatic1111_address>:7860
```
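The URL format above can be sanity-checked from a shell before saving it in Open WebUI. A minimal sketch, using a hypothetical host address (`127.0.0.1` is a placeholder; `/sdapi/v1/sd-models` is an AUTOMATIC1111 API endpoint, available when the web UI is launched with `--api`):

```shell
# Hypothetical address; substitute your AUTOMATIC1111 host.
A1111_HOST=127.0.0.1
API_URL="http://${A1111_HOST}:7860"
echo "${API_URL}"
# Uncomment to probe the API (requires AUTOMATIC1111 started with --api):
# curl -s "${API_URL}/sdapi/v1/sd-models"
```

If the `curl` probe returns a JSON list of models, the same URL should work in the Open WebUI Images settings.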
10 changes: 5 additions & 5 deletions docs/tutorial/litellm.md
@@ -1,8 +1,10 @@
-# LiteLLM Config
+# LiteLLM Configuration

[LiteLLM](https://litellm.vercel.app/docs/proxy/configs#quick-start) supports a variety of APIs, both OpenAI-compatible and others. To integrate a new API model, follow these instructions:

## Initial Setup

-To allow editing of your [LiteLLM](https://litellm.vercel.app/docs/proxy/configs#quick-start) `config.yaml` file, use `-v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml` to bind-mount it with your `docker run` command:
+To allow editing of your `config.yaml` file, use `-v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml` to bind-mount it with your `docker run` command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

@@ -12,9 +14,7 @@ _Note: `config.yaml` does not need to exist on the host before running for the f
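For reference, the bind-mounted file follows LiteLLM's `model_list` schema. A minimal sketch (the model name, provider identifier, and key below are illustrative placeholders, not values from this commit; see the LiteLLM proxy docs for the full schema):

```yaml
# Minimal illustrative config.yaml for the LiteLLM proxy.
model_list:
  - model_name: my-gpt-3.5          # name exposed to Open WebUI (placeholder)
    litellm_params:
      model: openai/gpt-3.5-turbo   # provider/model identifier (placeholder)
      api_key: "sk-..."             # placeholder key
```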

## Configuring Open WebUI

-**LiteLLM** supports a variety of APIs, both OpenAI-compatible and others. To integrate a new API model, follow these instructions:
-
-1. Go to the Settings > Models > LiteLLM model management interface.
+1. Go to the **Settings > Models > Manage LiteLLM Models**.
2. In 'Simple' mode, you will only see the option to enter a **Model**.
3. For additional configuration options, click on the 'Simple' toggle to switch to 'Advanced' mode. Here you can enter:

