🐬DeepChat - A smart assistant that connects powerful AI to your personal world
Updated Oct 6, 2025 - TypeScript
🌊 AChat - An open-source, self-hosted, local-first AI platform designed for enterprises and teams, combining powerful local processing with seamless remote synchronization.
A magical LLM desktop client that makes it easy for *anyone* to use LLMs and MCP.
ai-tools - call your LLM-based tools through a shortcut (Ctrl-Q) in any application.
✨ A lightweight desktop AI assistant that lives in your system tray with quick global hotkey access.
A Local-first LLM Interface — desktop-optimized and built for power users running Ollama and beyond.
FlexiProxy is a service proxy that exposes an OpenAI-compatible API in front of different upstream provider platforms. It lets users plug alternative backend services into their existing LLM clients, solving the problem of backends that are expensive or unavailable in certain regions while keeping the easy-to-use clients they already have.
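The core idea behind such a proxy can be sketched as a routing table: the client always speaks the OpenAI chat-completions format, and the proxy picks an upstream by model name. A minimal sketch, assuming hypothetical names (`resolveUpstream`, `UPSTREAMS`) and example endpoints that are not FlexiProxy's actual configuration:

```typescript
// Illustrative only: route an OpenAI-style request to a different upstream
// provider based on the requested model's name prefix.

interface Upstream {
  baseUrl: string; // OpenAI-compatible endpoint of the backend
  apiKey: string;  // credential for that backend
}

// Example mapping of model-name prefixes to upstream providers (assumed values).
const UPSTREAMS: Record<string, Upstream> = {
  "gpt-": { baseUrl: "https://api.openai.com/v1", apiKey: "sk-..." },
  "local-": { baseUrl: "http://localhost:11434/v1", apiKey: "none" },
};

// Find the upstream whose prefix matches the requested model; the client
// never sees the difference, since every upstream speaks the same API shape.
function resolveUpstream(model: string): Upstream | undefined {
  const prefix = Object.keys(UPSTREAMS).find((p) => model.startsWith(p));
  return prefix ? UPSTREAMS[prefix] : undefined;
}
```

A real proxy would then forward the request body to `resolveUpstream(model).baseUrl` with the mapped key, which is what makes regionally unavailable or costly backends swappable without touching the client.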
Just a library to store AI clients and API keys for other applications.
A pure C# WPF implementation of an LLM chat client.
AI Desktop Assistant - a lightweight AI agent that demonstrates tool calling.
Standard interface to connect various LLMs with a unified tool-based protocol.
Framework-agnostic, modular LLM client with parser-aware retries, async/sync OpenAI support, and plug-in adapters for LangChain, LlamaIndex, and others.
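"Parser-aware retries" means the client validates the model's reply with a parser and re-asks on failure, feeding the parse error back so the model can self-correct. A minimal sketch of that loop, with assumed names (`Completer`, `completeAndParse`) that are not this library's actual API:

```typescript
// Illustrative parser-aware retry loop: call the model, attempt to parse,
// and retry with the parse error appended to the prompt if parsing fails.

type Completer = (prompt: string) => Promise<string>;

async function completeAndParse<T>(
  complete: Completer,
  prompt: string,
  parse: (raw: string) => T, // should throw on malformed output
  maxRetries = 2,
): Promise<T> {
  let lastError = "";
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    // On retries, tell the model why its previous output was rejected.
    const fullPrompt = lastError
      ? `${prompt}\nPrevious output was invalid: ${lastError}`
      : prompt;
    const raw = await complete(fullPrompt);
    try {
      return parse(raw);
    } catch (e) {
      lastError = e instanceof Error ? e.message : String(e);
    }
  }
  throw new Error(`Giving up after ${maxRetries + 1} attempts: ${lastError}`);
}
```

The same shape works for JSON, XML, or any structured-output schema: the parser is the contract, and retries happen only when that contract is violated.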
Ask any LLM a question via your terminal.
A customizable and extensible client API for managing conversations and AI interactions, currently supporting the Google Gemini API, with the flexibility to support any similar AI API.
🤖 Build and integrate AI features easily with Tiny-AI-API, a lightweight and efficient API designed for seamless implementation in your projects.