A Q&A chatbot built with Streamlit that lets users query locally served models such as Llama3, Llama2, and Gemma:2b. The project uses LangChain for prompt management and Ollama to run the models, providing a flexible, customizable chatbot experience.
Python, Streamlit, LangChain, Ollama (llama3, llama2, gemma:2b), LangSmith
- Multiple Model Support: Users can choose among Llama3, Llama2, and Gemma:2b based on their needs.
- Customizable Responses: Adjustable temperature and max-token settings control response style and length (see the sketch after this list).
- User-Friendly Interface: Simple input field and submission button for asking queries, with real-time response generation.
- Environment Configuration: Uses a .env file to manage API keys and LangSmith tracing settings securely.
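
A minimal sketch of how these pieces might fit together, assuming the `streamlit`, `python-dotenv`, `langchain-core`, and `langchain-ollama` packages and a local Ollama server with the selected model already pulled; the prompt wording and slider ranges are illustrative, not taken from the project.

```python
import streamlit as st
from dotenv import load_dotenv
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_ollama import ChatOllama

# Load settings from .env, e.g. LangSmith tracing variables
# such as LANGCHAIN_API_KEY and LANGCHAIN_TRACING_V2.
load_dotenv()

# Prompt template shared by all models.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Answer the user's question."),
    ("user", "Question: {question}"),
])

st.title("Q&A Chatbot with Ollama")

# Sidebar controls: model choice plus generation parameters.
model_name = st.sidebar.selectbox("Model", ["llama3", "llama2", "gemma:2b"])
temperature = st.sidebar.slider("Temperature", 0.0, 1.0, 0.7)
max_tokens = st.sidebar.slider("Max tokens", 50, 300, 150)

question = st.text_input("Ask a question:")

if question:
    # num_predict is Ollama's cap on the number of generated tokens.
    llm = ChatOllama(model=model_name, temperature=temperature,
                     num_predict=max_tokens)
    chain = prompt | llm | StrOutputParser()
    st.write(chain.invoke({"question": question}))
```
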
A Q&A chatbot application built with Streamlit that lets users interact with OpenAI models such as GPT-4. It uses LangChain for prompt management and the OpenAI API for response generation, and exposes adjustable parameters to customize the model's behavior.
Python, Streamlit, OpenAI API, LangChain, dotenv
- Multiple OpenAI Model Selection: Users can choose among GPT-4, GPT-4-Turbo, and GPT-4o based on their preferences.
- Customizable Output: Adjust temperature and max tokens through the interface to control response creativity and length (see the sketch after this list).
- Secure API Key Input: Users can securely input their OpenAI API key through the sidebar for accessing the models.
- User-Friendly Query Input: Provides a simple input field for users to ask questions and receive real-time responses.
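
A minimal sketch of the OpenAI-backed variant, assuming the `streamlit`, `python-dotenv`, `langchain-core`, and `langchain-openai` packages; the model list, prompt wording, and slider ranges are illustrative assumptions rather than the project's exact configuration.

```python
import streamlit as st
from dotenv import load_dotenv
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

load_dotenv()  # picks up any defaults stored in a local .env file

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant. Answer the user's question."),
    ("user", "Question: {question}"),
])

st.title("Q&A Chatbot with OpenAI")

# Sidebar: user-supplied API key plus model and generation settings.
api_key = st.sidebar.text_input("OpenAI API key", type="password")
model_name = st.sidebar.selectbox("Model", ["gpt-4", "gpt-4-turbo", "gpt-4o"])
temperature = st.sidebar.slider("Temperature", 0.0, 1.0, 0.7)
max_tokens = st.sidebar.slider("Max tokens", 50, 300, 150)

question = st.text_input("Ask a question:")

if question and api_key:
    # The key entered in the sidebar is passed directly to the model client.
    llm = ChatOpenAI(model=model_name, api_key=api_key,
                     temperature=temperature, max_tokens=max_tokens)
    chain = prompt | llm | StrOutputParser()
    st.write(chain.invoke({"question": question}))
elif question:
    st.warning("Please enter your OpenAI API key in the sidebar.")
```
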