A custom Streamlit web app to chat with the latest LLMs and pay per use
instead of a fixed monthly price.
Use many large language models (OpenAI, Anthropic, and open/local LLMs) from a single Streamlit web app.
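As a rough illustration of the pay-per-use idea: these APIs are metered per token, so the cost of one chat turn can be estimated directly. The rates below are placeholder assumptions, not current provider pricing:

```python
# Rough per-use cost estimate for a metered LLM API call.
# The rates below are illustrative placeholders, NOT current provider pricing.
PRICES_PER_1K_TOKENS = {
    # model: (input USD / 1K tokens, output USD / 1K tokens) -- assumed values
    "gpt-4o-mini": (0.00015, 0.0006),
    "claude-3-5-sonnet": (0.003, 0.015),
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Approximate USD cost of a single chat completion."""
    in_rate, out_rate = PRICES_PER_1K_TOKENS[model]
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate
```

For light personal use (a few thousand tokens a day on a small model), this works out to cents per month rather than a flat subscription.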
- LLM Support
- Ollama - open-source models
- OpenAI - GPT-3.5 / GPT-4 / GPT-4o / GPT-4o-mini
- Anthropic - Claude 3 (Opus / Sonnet) / Claude 3.5
- Groq API - Llama models with fast LPU inference
- Extended explanation
- SliDev presentation of the Streamlit-MultiChat
- This blog post →
- Deploy as described in https://github.com/JAlcocerT/Streamlit-MultiChat/tree/main/Z_DeployMe
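As background on why one app can cover several backends: Groq and Ollama both expose OpenAI-compatible HTTP endpoints, so a single OpenAI-style client can reach three of the four providers. A minimal sketch (not the repo's actual code; the helper is illustrative):

```python
# Illustrative sketch: Groq and Ollama expose OpenAI-compatible APIs,
# so one OpenAI-style client covers three backends. Base URLs are the
# providers' documented OpenAI-compatible endpoints.
ENDPOINTS = {
    "openai": None,                            # SDK default base URL
    "groq": "https://api.groq.com/openai/v1",  # Groq's OpenAI-compatible API
    "ollama": "http://localhost:11434/v1",     # local Ollama server
}

def chat(provider: str, model: str, prompt: str, api_key: str) -> str:
    """Send one prompt through the chosen backend and return the reply text."""
    from openai import OpenAI  # pip install openai
    client = OpenAI(base_url=ENDPOINTS[provider], api_key=api_key)
    resp = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": prompt}]
    )
    return resp.choices[0].message.content
```

Anthropic is the odd one out and needs its own SDK (`anthropic`), which is why the app handles it separately.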
During the process, I also explored SliDev presentations, ScrapeGraph, DALL·E, Streamlit authentication, and OpenAI custom agents.
The Project is documented here →
Clone the repository and run it with your API keys 👇
- OpenAI API Keys - https://platform.openai.com/api-keys
- Anthropic - https://console.anthropic.com/settings/keys
- Groq - https://console.groq.com/keys
- For Ollama, you need this setup
Try the project quickly with a Python venv:
- Install Python
- Create and activate a venv
git clone https://github.com/JAlcocerT/Streamlit-MultiChat
python3 -m venv multichat_venv      #create the venv (Linux)
#python -m venv multichat_venv     #create the venv (Windows)
source multichat_venv/bin/activate  #activate the venv (Linux)
#multichat_venv\Scripts\activate   #activate the venv (Windows)
Then, provide the API Keys and run the Streamlit Web App:
pip install -r requirements.txt #all at once
cp ./.streamlit/secrets_sample.toml ./.streamlit/secrets.toml #fill the API Keys
streamlit run Z_multichat.py
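The secrets file copied above might look like the following; the exact key names are assumptions (check secrets_sample.toml in the repo for the canonical ones):

```toml
# .streamlit/secrets.toml -- key names assumed, see secrets_sample.toml
OPENAI_API_KEY = "sk-..."
ANTHROPIC_API_KEY = "sk-ant-..."
GROQ_API_KEY = "gsk_..."
```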
- Make sure Ollama is installed and already running your desired model!
- Prepare the API Keys in any of:
- .streamlit/secrets.toml
- As Environment Variables
- Linux -
export OPENAI_API_KEY="YOUR_API_KEY"
- CMD -
set OPENAI_API_KEY=YOUR_API_KEY
- PS -
$env:OPENAI_API_KEY="YOUR_API_KEY"
- In the Docker-Compose file, as environment variables
- Through the Streamlit UI
- Alternatively - Use the Docker Image
docker pull ghcr.io/jalcocert/streamlit-multichat:latest #x86/ARM64
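A minimal Compose sketch for running that image (the service name, port, and variable names are illustrative; the Z_DeployMe folder linked above has the canonical deployment files):

```yaml
# docker-compose.yml -- illustrative sketch, see Z_DeployMe for the real files
services:
  streamlit-multichat:
    image: ghcr.io/jalcocert/streamlit-multichat:latest
    ports:
      - "8501:8501"                      # Streamlit's default port
    environment:
      - OPENAI_API_KEY=YOUR_API_KEY      # variable names assumed
      - ANTHROPIC_API_KEY=YOUR_API_KEY
      - GROQ_API_KEY=YOUR_API_KEY
```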
The projects I took inspiration from and consolidated into this app are tested under ./Z_Tests