SideLlama is a plugin for Obsidian that allows you to chat with local AI models using Ollama. It provides a user-friendly interface to interact with your models, making it easy to generate text, answer questions, and more.
This plugin is designed to enhance your note-taking experience by integrating AI capabilities directly into your Obsidian workspace.
- 💬 Chat interface for interacting with Ollama models
- 🎯 Customizable model selection
- 🔧 Adjustable temperature settings
- 🚀 Streaming responses for real-time interaction
- 📝 Add responses directly to your notes
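Streaming responses work by POSTing to Ollama's `/api/generate` endpoint with `stream: true`, which returns one JSON object per line as the model generates. The sketch below shows how such a request body can be built; the `buildGenerateRequest` helper is illustrative, not the plugin's actual code.

```typescript
// Shape of the body sent to Ollama's POST /api/generate endpoint,
// trimmed to the fields used here.
interface GenerateRequest {
  model: string;
  prompt: string;
  stream: boolean;
  options: { temperature: number };
}

// Build a streaming generate request for the given model and prompt.
function buildGenerateRequest(
  model: string,
  prompt: string,
  temperature: number
): GenerateRequest {
  return { model, prompt, stream: true, options: { temperature } };
}

// Usage (hypothetical):
// await fetch("http://localhost:11434/api/generate", {
//   method: "POST",
//   body: JSON.stringify(buildGenerateRequest("gemma3", "Hello", 0.7)),
// });
```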
- Obsidian v0.15.0 or higher
- Ollama installed and running locally
- The `gemma3` model (the default) pulled and available in Ollama (`ollama pull gemma3`)
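To verify the model requirement, Ollama's `GET /api/tags` endpoint lists the installed models. A minimal sketch of checking that list (the `hasModel` helper and its types are our own names, not part of the plugin's API):

```typescript
// Shape of the response from Ollama's GET /api/tags endpoint,
// trimmed to the field we need.
interface TagsResponse {
  models: { name: string }[];
}

// Returns true if a model (e.g. "gemma3") is installed.
// Ollama names models as "name:tag", so match the base name too.
function hasModel(tags: TagsResponse, model: string): boolean {
  return tags.models.some(
    (m) => m.name === model || m.name.split(":")[0] === model
  );
}

// Usage (hypothetical):
// const res = await fetch("http://localhost:11434/api/tags");
// if (!hasModel(await res.json(), "gemma3")) {
//   // prompt the user to run `ollama pull gemma3`
// }
```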
- Open Obsidian Settings
- Navigate to Community Plugins and disable Safe Mode
- Click "Browse" and search for "SideLlama"
- Install the plugin
- Enable the plugin in your Obsidian settings
- Open the command palette (Ctrl+P or Cmd+P)
- Type "Ollama" to find the commands
- Select "Open Chat in Sidebar" to open the chat interface
- Start chatting with your local AI model
- Click the Ollama icon in the left ribbon
- Type your message in the text area
- Click "Ask" to get a response
- Use "Add to Editor" to insert responses into your notes
- Use "Clear Chat" to start a fresh conversation
- Open Chat in Sidebar: Open the chat interface
You can configure the following settings:
- Model Name: Choose which Ollama model to use (default: gemma3)
- API Endpoint: Set a custom API endpoint (default: http://localhost:11434)
- Temperature: Adjust response randomness (0.0 to 1.0) (default: 0.7)
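The defaults above can be captured in a typical Obsidian plugin settings object. This is a sketch; the `SideLlamaSettings` and `DEFAULT_SETTINGS` names are chosen for illustration and may differ from the plugin's source.

```typescript
interface SideLlamaSettings {
  modelName: string;   // which Ollama model to query
  apiEndpoint: string; // base URL of the local Ollama server
  temperature: number; // 0.0 (deterministic) to 1.0 (most random)
}

const DEFAULT_SETTINGS: SideLlamaSettings = {
  modelName: "gemma3",
  apiEndpoint: "http://localhost:11434",
  temperature: 0.7,
};
```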
```bash
# Clone the repository
git clone https://github.com/asmit404/sidellama.git
cd sidellama

# Install dependencies
npm install

# Build the plugin
npm run build

# Development mode
npm run dev
```
This project is licensed under the MIT License - see the LICENSE file for details.
If you encounter any issues or have suggestions, please visit the GitHub issues page.