From e6f0aff234eab11bee824603b60fd184aa7c65ed Mon Sep 17 00:00:00 2001
From: R7B7
Date: Thu, 5 Dec 2024 15:11:23 -0600
Subject: [PATCH] Update README.md

Updated description, added banner image, removed code examples as they have
been moved to the wiki.
---
 README.md | 96 +++++++++++++------------------------------------
 1 file changed, 22 insertions(+), 74 deletions(-)

diff --git a/README.md b/README.md
index 4e7a3f8..0f72a9c 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,6 @@
-# hosp-ai
+![Banner GIF](banner.gif)
+
+# HOSP-AI
 
 Universal LLM Provider Connector for Java
 ![GitHub stars](https://img.shields.io/github/stars/r7b7/hosp-ai?style=social)
@@ -7,87 +9,33 @@
 [![PRs Welcome](https://img.shields.io/badge/PRs-welcome-brightgreen.svg)](CONTRIBUTING.md)
 [![Wiki](https://img.shields.io/badge/Documentation-Wiki-blue)](https://github.com/r7b7/hosp-ai/wiki)
 
+## Why HOSP-AI?
+TBA
+
+## Contributions are Welcome (in need of volunteers)
+1. Fork the repo
+2. Create a branch - name it after the issue fix, documentation change, or feature
+3. Open a PR
 
-## How to Add hosp-ai as a Maven Dependency
+## Installation
 1. Add jitpack repository in pom file
 ```bash
-    <repositories>
-        <repository>
-            <id>jitpack.io</id>
-            <url>https://jitpack.io</url>
-        </repository>
-    </repositories>
+   <repositories>
+       <repository>
+           <id>jitpack.io</id>
+           <url>https://jitpack.io</url>
+       </repository>
+   </repositories>
 
-2. Add hosp-ai dependency
+2. Add hosp-ai dependency (check latest version)
 ```bash
-    <dependency>
-        <groupId>com.github.r7b7</groupId>
-        <artifactId>hosp-ai</artifactId>
-        <version>v1.0.0-alpha.1</version>
-    </dependency>
-
-## How to Use in Code
-
-    LLMService service = LLMServiceFactory.createService(Provider.GROQ, "<apiKey>", "mixtral-8x7b-32768");
-    PromptEngine promptEngine = new PromptEngine(service);
-    CompletionResponse response = promptEngine.getResponse(query);
-
-    **Explanation:**
-    1. Create an instance of LLMService - Pass provider name along with apikey and model name
-    2. Create an instance of PromptEngine and pass the service instance to it
-    3. Send the Query
-
-    **How to choose a different model:**
-    1. To use a different model from same provider, just update the model name
-    ```bash
-    LLMService service = LLMServiceFactory.createService(Provider.GROQ, "<apiKey>", "<modelName>");
-
-    2. To use a different model from different provider, update all the params passed to create an instance of LLMService
-    ```bash
-    LLMService service = LLMServiceFactory.createService(<Provider>, "<apiKey>", "<modelName>");
-
-    3. Rest of the code remains the same.
-
-## Currently Supported Providers and Platforms
-1. OpenAI
-2. Anthropic
-3. Groq
-4. Ollama
-
-## Features in Pipeline
-1. Accept Image Input
-2. Support Streaming response
-
-## Integration with SpringBoot
-1. Add Jitpack and hosp-ai maven dependencies as mentioned in the beginning.
-
-   Sample code:
-
-    @GetMapping("/prompt")
-    public CompletionResponse getChatCompletion(@RequestParam String query){
-        LLMService service = LLMServiceFactory.createService(Provider.GROQ, "<apiKey>", "mixtral-8x7b-32768");
-        PromptEngine promptEngine = new PromptEngine(service);
-        CompletionResponse response = promptEngine.getResponse(query);
-        return response;
-    }
-
-2. Set values of api-key in yaml or properties file
-3. Use a custom WebClient
-   While using frameworks like Spring Boot, developers might want to leverage the strengths provided by WebClient or RestTemplate. In such scenarios,
-   one can override the default client implementation provided in the library by following two steps,
-
-   1. Create a Custom Client class and implement one of the Client interfaces, e.g. GroqClient,
-
-   ```bash
-   public class CustomGroqClient implements GroqClient
+   <dependency>
+       <groupId>com.github.r7b7</groupId>
+       <artifactId>hosp-ai</artifactId>
+       <version>v1.0.0-alpha.2</version>
+   </dependency>
 
-   2. Set the client in ClientFactory class
-   ```bash
-   GroqClientFactory.setClient(customGroqClient);
-
-## More Examples
+## Working Examples
 For working examples and tutorials - visit [Wiki](https://github.com/r7b7/hosp-ai/wiki)
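
Note (not part of the patch; git am ignores content after the final hunk): the two installation steps this patch adds to the README combine into a single pom.xml fragment. This is a sketch assuming a standard Maven project layout; the version tag is the one the patch introduces, so check JitPack for the latest release before copying it.

```xml
<!-- JitPack repository plus the hosp-ai artifact, as added in this patch -->
<repositories>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>

<dependencies>
    <dependency>
        <groupId>com.github.r7b7</groupId>
        <artifactId>hosp-ai</artifactId>
        <version>v1.0.0-alpha.2</version>
    </dependency>
</dependencies>
```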