Merge pull request #14 from justinh-rahb/main
Updates, additions to FAQ, tutorials
tjbck authored Feb 26, 2024
2 parents cacebf6 + 0cb2dfb commit b700ca4
Showing 5 changed files with 29 additions and 9 deletions.
8 changes: 4 additions & 4 deletions docs/faq.md
@@ -5,16 +5,16 @@ title: "📋 FAQ"

# 📋 Frequently Asked Questions

**Q: Why am I asked to sign up? Where is my data being sent?**
#### **Q: Why am I asked to sign up? Where is my data being sent?**

**A:** We require you to sign up to become the admin user for enhanced security. This ensures that if your Open WebUI instance is ever exposed to external access, your data remains secure. It's important to note that everything is kept local. We do not collect your data. When you sign up, all information stays on your server and never leaves your device. Your privacy and security are our top priorities, ensuring that your data remains under your control at all times.

**Q: Why can't my Docker container connect to services on the host using `localhost`?**
#### **Q: Why can't my Docker container connect to services on the host using `localhost`?**

**A:** Inside a Docker container, `localhost` refers to the container itself, not the host machine. This distinction is crucial for networking. To establish a connection from your container to services running on the host, you should use the DNS name `host.docker.internal` instead of `localhost`. This DNS name is specially recognized by Docker to facilitate such connections, effectively treating the host as a reachable entity from within the container, thus bypassing the usual `localhost` scope limitation.
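
As a quick way to verify this from inside a container, the sketch below assumes an Ollama server is listening on the host's port 11434; the `--add-host` flag is only needed on Linux, since Docker Desktop resolves `host.docker.internal` automatically:

```bash
# Map host.docker.internal to the host's gateway IP (Linux), then
# query the host-side service from within a throwaway container.
docker run --rm --add-host=host.docker.internal:host-gateway curlimages/curl \
  curl -s http://host.docker.internal:11434/api/version
```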

**Q: How do I make my host's services accessible to Docker containers?**
#### **Q: How do I make my host's services accessible to Docker containers?**

**A:** To make services running on the host accessible to Docker containers, configure these services to listen on all network interfaces, using the IP address `0.0.0.0`, instead of `127.0.0.1` which is limited to localhost only. This configuration allows the services to accept connections from any IP address, including Docker containers. It's important to be aware of the security implications of this setup, especially when operating in environments with potential external access. Implementing appropriate security measures, such as firewalls and authentication, can help mitigate risks.
**A:** To make services running on the host accessible to Docker containers, configure these services to listen on all network interfaces, using the IP address `0.0.0.0`, instead of `127.0.0.1` which is limited to `localhost` only. This configuration allows the services to accept connections from any IP address, including Docker containers. It's important to be aware of the security implications of this setup, especially when operating in environments with potential external access. Implementing appropriate security measures, such as firewalls and authentication, can help mitigate risks.
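
To illustrate the difference, this small Python sketch binds a socket to all interfaces; a service bound this way can accept connections from Docker containers, whereas a `127.0.0.1` bind only accepts loopback clients:

```python
import socket

# Binding to 0.0.0.0 accepts connections on every network interface,
# including the Docker bridge network; binding to 127.0.0.1 instead
# would restrict the socket to loopback-only clients.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("0.0.0.0", 0))  # port 0 lets the OS pick a free port
srv.listen(1)
print(srv.getsockname()[0])  # → 0.0.0.0
srv.close()
```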

If you have any further questions or concerns, please don't hesitate to reach out! 🛡️
6 changes: 3 additions & 3 deletions docs/tutorial/images.md
@@ -1,15 +1,15 @@
# Image Generation

Open WebUI now supports image generation through the AUTOMATIC1111 [API](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/API). To set this up, follow these steps:
Open WebUI now supports image generation through the **AUTOMATIC1111** [API](https://github.com/AUTOMATIC1111/stable-diffusion-webui/wiki/API). To set this up, follow these steps:

## Initial Setup

1. Ensure that you have [AUTOMATIC1111](https://github.com/AUTOMATIC1111/stable-diffusion-webui) installed.
2. Launch the WebUI with additional flags to enable API access:
2. Launch AUTOMATIC1111 with additional flags to enable API access:
```
./webui.sh --api --listen
```
For Docker installations, use the `--listen` flag to allow connections outside of localhost.
For Docker installations of Open WebUI, use the `--listen` flag to allow connections outside of localhost.
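
To confirm the API is enabled, you can query one of AUTOMATIC1111's documented endpoints (this assumes the default port 7860; adjust the host and port to your setup):

```bash
# Returns a JSON list of installed checkpoints when --api is active
curl http://localhost:7860/sdapi/v1/sd-models
```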

## Configuring Open WebUI

24 changes: 22 additions & 2 deletions docs/tutorial/litellm.md
@@ -1,6 +1,18 @@
# LiteLLM Config

LiteLLM supports a variety of APIs, both OpenAI-compatible and others. To integrate a new API model, follow these instructions:
## Initial Setup

To allow editing of your [LiteLLM]() `config.yaml` file, use `-v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml` to bind-mount it with your `docker run` command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data -v /path/to/litellm/config.yaml:/app/backend/data/litellm/config.yaml --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```

*Note: `config.yaml` does not need to exist on the host before running for the first time.*
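
As a sketch of what such a `config.yaml` might contain, the fragment below maps a friendly model name to an Ollama backend reachable through the Docker host gateway; the model name and API base are illustrative, so consult the LiteLLM documentation for the exact schema:

```yaml
model_list:
  - model_name: my-ollama-llama2        # name exposed to Open WebUI
    litellm_params:
      model: ollama/llama2              # provider/model in LiteLLM notation
      api_base: http://host.docker.internal:11434
```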

## Configuring Open WebUI

**LiteLLM** supports a variety of APIs, both OpenAI-compatible and others. To integrate a new API model, follow these instructions:

1. Go to the Settings > Models > LiteLLM model management interface.
2. In 'Simple' mode, you will only see the option to enter a **Model**.
@@ -12,4 +12,12 @@ LiteLLM supports a variety of APIs, both OpenAI-compatible and others. To integr

4. After entering all the required information, click the '+' button to add the new model to LiteLLM.

For more information on the specific providers and advanced settings, consult the [LiteLLM Providers Documentation](https://litellm.vercel.app/docs/providers).
## Examples

*Ollama API (from inside Docker):*
![LiteLLM Config Ollama](/img/tutorial_litellm_ollama.png)

*Gemini API (MakerSuite/AI Studio):*
![LiteLLM Config Gemini](/img/tutorial_litellm_gemini.png)

Advanced configuration options not covered in the settings interface can be edited in the `config.yaml` file manually. For more information on the specific providers and advanced settings, consult the [LiteLLM Providers Documentation](https://litellm.vercel.app/docs/providers).
Binary file added static/img/tutorial_litellm_gemini.png
Binary file added static/img/tutorial_litellm_ollama.png
