This React application uses the Ollama REST API to let users chat with a locally running large language model (LLM) through a user-friendly chat interface. The project demonstrates how to integrate the Ollama REST API into a frontend app to generate AI-driven responses.
- Interactive Chat Interface: Chat with the LLM directly from your browser.
- LLM-Powered Responses: Send prompts to the LLM and receive AI-generated replies.
- API Integration: Utilizes Ollama's REST API to manage requests and responses (see the request sketch after this list).
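At its core, the integration is a single HTTP call to Ollama's REST API. The sketch below is illustrative rather than the app's exact code: it sends a non-streaming request to Ollama's `/api/generate` endpoint, assuming Ollama is running on its default port `11434` with the `llama3.2` model pulled.

```javascript
// Minimal request to Ollama's /api/generate endpoint.
// Assumes Ollama is running locally on its default port (11434)
// and the llama3.2 model has already been pulled.
async function generateReply(prompt) {
  const response = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      prompt,
      stream: false, // return one JSON object instead of a token stream
    }),
  });
  const data = await response.json();
  return data.response; // the generated text
}

generateReply("Why is the sky blue?").then(console.log);
```

Setting `stream: false` keeps the example short by returning the full reply in a single JSON object; a chat UI could instead consume Ollama's streamed token-by-token response.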
- Node.js: Ensure Node.js is installed.
- Ollama: Install Ollama and run it locally so its REST API is reachable (by default at http://localhost:11434).
- LLM Model: llama3.2, pulled with `ollama pull llama3.2` (see the availability check after this list).
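Before starting the app, you can verify that the Ollama API is reachable and the model is installed. The snippet below is a quick check (not part of the repository) against Ollama's `/api/tags` endpoint, which lists locally available models:

```javascript
// Quick check that the Ollama REST API is reachable and llama3.2 is installed.
// Assumes Ollama's default address; adjust if you run it elsewhere.
async function checkOllama() {
  const res = await fetch("http://localhost:11434/api/tags");
  if (!res.ok) throw new Error(`Ollama API returned ${res.status}`);
  const { models } = await res.json();
  const hasModel = models.some((m) => m.name.startsWith("llama3.2"));
  console.log(hasModel ? "llama3.2 is available" : "Run: ollama pull llama3.2");
}

checkOllama().catch((err) => console.error("Is Ollama running?", err));
```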
- Clone the repository:

  ```bash
  git clone https://github.com/arunpatidar02/llm-chat-app.git
  cd llm-chat-app
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Run the app:

  ```bash
  npm start
  ```
The app will open at http://localhost:3001.
- Open the app in your browser.
- Enter prompts in the chat interface, and the app will display responses from the LLM (a minimal sketch of this request/response flow follows below).
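For orientation, here is a minimal sketch of what such a chat flow can look like in React. It is illustrative only and assumes Ollama's `/api/chat` endpoint with the `llama3.2` model; the project's actual component may differ.

```jsx
// Hypothetical chat component: sends the conversation to Ollama's /api/chat
// endpoint and renders the assistant's reply.
import { useState } from "react";

export default function Chat() {
  const [input, setInput] = useState("");
  const [messages, setMessages] = useState([]);

  async function send() {
    const history = [...messages, { role: "user", content: input }];
    setMessages(history);
    setInput("");

    const res = await fetch("http://localhost:11434/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model: "llama3.2", messages: history, stream: false }),
    });
    const data = await res.json();
    // data.message has the shape { role: "assistant", content: "..." }
    setMessages([...history, data.message]);
  }

  return (
    <div>
      {messages.map((m, i) => (
        <p key={i}><strong>{m.role}:</strong> {m.content}</p>
      ))}
      <input value={input} onChange={(e) => setInput(e.target.value)} />
      <button onClick={send}>Send</button>
    </div>
  );
}
```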
Adjust model parameters and settings through the API configuration in the source code to experiment with different response behaviors.
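For example, Ollama's generate and chat endpoints accept an `options` object in the request body. The hypothetical snippet below shows a few common parameters; the values are illustrative, not the project's defaults.

```javascript
// Hypothetical configuration: tune generation behavior via Ollama's "options"
// field on the request body. Values shown here are illustrative only.
const body = {
  model: "llama3.2",
  prompt: "Summarize the Ollama REST API in one sentence.",
  stream: false,
  options: {
    temperature: 0.7, // higher = more varied output, lower = more deterministic
    top_p: 0.9,       // nucleus sampling cutoff
    num_predict: 256, // maximum number of tokens to generate
  },
};

fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(body),
})
  .then((r) => r.json())
  .then((data) => console.log(data.response));
```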
This project is open-source and available under the MIT License.