feat: add Ollama provider support for local LLMs#19
Open
clawmango0 wants to merge 5 commits into moltlaunch:main from
Conversation
- Add 'ollama' to LLM provider type in config.ts
- Add default model 'llama3' for Ollama
- Add Ollama case in createLLMProvider (uses OpenAI-compatible API at localhost:11434)
- Add Ollama to setup wizard and settings UI

Ollama uses an OpenAI-compatible API, so it reuses the existing openai-compatible provider with a local base URL.

- Add comprehensive test suite for LLM providers in test/llm.test.ts
- Test Ollama-specific behavior: localhost:11434 URL, dummy API key fallback, custom models
- Test tool call parsing from Ollama responses
- Test error handling for API failures
- All 19 tests pass (12 new + 7 existing)

- Create dedicated createOllamaProvider function instead of reusing the OpenAI one
- No API key required for a local Ollama instance
- Add helpful error messages for tool calling limitations
- Tool calling requires Ollama >= 0.1.20 and specific models (llama3.1, qwen2.5, mistral)
- Update tests to match the new implementation

- Change default Ollama model from llama3 to llama3.1 (supports tool calling)
- Make apiKey optional in the LLMConfig interface for Ollama
- Update isConfigured() to not require apiKey for Ollama
- Add tests for empty string API key handling
- Add test for connection errors (Ollama not running)
- All 22 tests passing
Summary
Adds Ollama as a new LLM provider option, enabling CashClaw to use local models. This is ideal for users who want to run LLMs locally without sending data to external APIs.
Changes

Backend (src/)
- config.ts: add `ollama` to the `LLMConfig.provider` type, default the model to `llama3.1`, make `apiKey` optional for local Ollama, and update `isConfigured()` to not require an API key for Ollama
- llm/index.ts: create a dedicated `createOllamaProvider()` function with Ollama-specific handling (localhost:11434, no auth required)

Frontend (src/ui/)
- pages/setup/LLMStep.tsx: add Ollama option to the setup wizard
- pages/Settings.tsx: add Ollama option to the settings page

Tests (test/)
- llm.test.ts: add 15 comprehensive tests for the Ollama provider

Important: Tool Calling Support
Ollama's tool calling support varies by model: it requires Ollama >= 0.1.20 and a tool-capable model such as llama3.1, qwen2.5, or mistral.
The provider includes helpful error messages when the selected model doesn't support tool calling.
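A capability check along these lines could produce those error messages. This is a sketch, not the PR's actual code: the helper names and the exact model list are assumptions based on the description above.

```typescript
// Hypothetical helper (names and model list assumed from the PR description):
// refuse to send tool definitions to a model that can't use them.
const TOOL_CAPABLE_MODELS = ["llama3.1", "qwen2.5", "mistral"];

function supportsToolCalling(model: string): boolean {
  // Match on the base model name, ignoring any ":tag" suffix (e.g. "llama3.1:8b").
  const base = model.split(":")[0];
  return TOOL_CAPABLE_MODELS.includes(base);
}

function assertToolSupport(model: string): void {
  if (!supportsToolCalling(model)) {
    throw new Error(
      `Model "${model}" does not support tool calling. ` +
        "Use a tool-capable model (llama3.1, qwen2.5, mistral) with Ollama >= 0.1.20.",
    );
  }
}
```

Failing fast here gives the user an actionable message instead of a confusing API error mid-task.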
Requirements
- A running local Ollama instance (`ollama serve`)

Usage
Example Config

```json
{
  "llm": {
    "provider": "ollama",
    "model": "llama3.1",
    "apiKey": ""
  }
}
```

Testing
All tests pass (22 total), including the Ollama-specific tests.
Implementation Notes

- Dedicated Provider: unlike OpenAI/OpenRouter, which share a provider, Ollama has its own provider function to handle the localhost:11434 base URL, the missing API key, and tool-calling error messages.
- Configuration: the `isConfigured()` function now properly handles Ollama by not requiring an API key when the provider is "ollama".
- Tool Calling: CashClaw relies on tool calling for the agent to interact with the marketplace (quote tasks, submit work, etc.). Without tool support, the agent can only do text-based reasoning and cannot execute actions.
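A minimal sketch of these two pieces follows. Beyond the names the PR mentions (`createOllamaProvider`, `isConfigured`), the shapes and fields are assumptions for illustration, not the actual implementation:

```typescript
// Assumed config shape: apiKey is optional because local Ollama needs no key.
interface LLMConfig {
  provider: "openai" | "openrouter" | "ollama";
  model: string;
  apiKey?: string;
}

function isConfigured(config: LLMConfig): boolean {
  if (!config.model) return false;
  // Local Ollama requires no API key; hosted providers still do.
  return config.provider === "ollama" || Boolean(config.apiKey);
}

function createOllamaProvider(config: LLMConfig) {
  const baseUrl = "http://localhost:11434/v1"; // Ollama's OpenAI-compatible endpoint
  return {
    async chat(messages: Array<{ role: string; content: string }>) {
      let res: Response;
      try {
        res = await fetch(`${baseUrl}/chat/completions`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ model: config.model, messages }),
        });
      } catch {
        // Connection refused usually means the local daemon isn't up.
        throw new Error("Could not reach Ollama at localhost:11434. Is `ollama serve` running?");
      }
      if (!res.ok) {
        throw new Error(`Ollama request failed with status ${res.status}`);
      }
      return res.json();
    },
  };
}
```

With this shape, `isConfigured({ provider: "ollama", model: "llama3.1" })` returns true even though no `apiKey` is set, while hosted providers still require one.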
Closes #12