The AI agent script CLI for Programmable Prompt Engine.
Updated Apr 5, 2025 · TypeScript
OfflineAI is an artificial intelligence that operates offline and uses machine learning to perform tasks based on the code provided. It is built on two AI models from Mistral AI.
An excellent cross-platform local AI chat client, compatible with all large models that work with the Ollama and OpenAI APIs. Local deployment protects your data privacy, and it can serve as both an Ollama client and an OpenAI client.
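Several of the projects listed here talk to a locally running Ollama server through its OpenAI-compatible chat endpoint. A minimal sketch of that call, assuming Ollama's default port 11434 and a hypothetical model name "llama3" (adjust to whatever `ollama list` shows on your machine):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the JSON body for a /v1/chat/completions request.
// `stream: false` asks for a single complete response instead of chunks.
function buildChatRequest(model: string, messages: ChatMessage[]): string {
  return JSON.stringify({ model, messages, stream: false });
}

// Send the request to a local Ollama server and return the reply text.
async function chat(model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch("http://localhost:11434/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildChatRequest(model, messages),
  });
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Because the endpoint follows the OpenAI chat-completions shape, the same client code can point at either a local Ollama instance or a hosted OpenAI-compatible service by changing only the base URL.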
Demo project showcasing the integration and usage of Chrome’s built-in Gemini Nano AI through the window.ai interface.
Efficient on-device offline AI model inference using MediaPipe with optimized model screening.
A private, local RAG (Retrieval-Augmented Generation) system using Flowise, Ollama, and open-source LLMs to chat with your documents securely and offline.
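The RAG pattern this entry describes reduces to: embed document chunks, retrieve the chunks nearest to the query, and pass them to the LLM as context. A toy sketch of the retrieval step, with hand-made stand-in vectors where a real system (e.g. Flowise + Ollama) would use embeddings from a model:

```typescript
// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the texts of the k chunks most similar to the query embedding.
function topK(
  query: number[],
  chunks: { text: string; vec: number[] }[],
  k: number,
): string[] {
  return [...chunks]
    .sort((x, y) => cosine(query, y.vec) - cosine(query, x.vec))
    .slice(0, k)
    .map((c) => c.text);
}
```

The retrieved texts are then concatenated into the prompt before it is sent to the local model, which is what keeps the whole loop offline: both the vector store and the LLM live on your machine.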
Ollama client for Android.
OmniBot - Run LLMs natively & privately in your browser
AI assistant for frontline health workers to improve maternal care, nutrition, and scheme access.
LocalPrompt is an AI-powered tool designed to refine and optimize AI prompts, helping users run locally hosted AI models like Mistral-7B for privacy and efficiency. Ideal for developers seeking to run LLMs locally without external APIs.
LLM-DeskAI aims to be an LLM-powered desktop AI chat app that runs fully offline.
A self-hosted AI chatbot for privacy-conscious users. Runs locally with Ollama, ensuring data never leaves your device. Built with SvelteKit for performance and flexibility. No external dependencies—your AI, your rules. 🚀