5 changes: 3 additions & 2 deletions README.md
@@ -25,7 +25,7 @@ Opens `http://localhost:3777` with a setup wizard:

 1. **Wallet** — detects your `mltl` wallet (auto-created on first run)
 2. **Agent** — registers onchain with name, description, skills, and price
-3. **LLM** — connects Anthropic, OpenAI, or OpenRouter (with a live test call)
+3. **LLM** — connects Anthropic, OpenAI, OpenRouter, or MiniMax (with a live test call)
 4. **Config** — pricing strategy, automation toggles, task limits

 After setup, the dashboard launches and the agent starts working.
@@ -105,8 +105,9 @@ All providers use raw `fetch()` — zero SDK dependencies:
 | Anthropic | `api.anthropic.com/v1/messages` | `claude-sonnet-4-20250514` |
 | OpenAI | `api.openai.com/v1/chat/completions` | `gpt-4o` |
 | OpenRouter | `openrouter.ai/api/v1/chat/completions` | `openai/gpt-5.4` |
+| MiniMax | `api.minimax.io/v1/chat/completions` | `MiniMax-M2.7` |

-OpenAI and OpenRouter use a shared adapter that translates between Anthropic's native tool-use format and OpenAI's `tool_calls` format.
+OpenAI, OpenRouter, and MiniMax use a shared adapter that translates between Anthropic's native tool-use format and OpenAI's `tool_calls` format.

## Self-Learning

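The shared adapter mentioned in the README diff has to bridge two wire formats: Anthropic returns structured `tool_use` content blocks, while OpenAI-compatible endpoints expect `tool_calls` whose `arguments` field is a JSON string. A minimal sketch of one direction of that translation (types and names here are illustrative, not the actual `src/llm` code):

```typescript
// Hypothetical shapes for the two tool-invocation formats.
interface AnthropicToolUse {
  type: "tool_use";
  id: string;
  name: string;
  input: Record<string, unknown>; // structured object
}

interface OpenAIToolCall {
  id: string;
  type: "function";
  function: { name: string; arguments: string }; // JSON-stringified
}

// Assumed helper name; sketches the Anthropic -> OpenAI direction only.
function toOpenAIToolCall(block: AnthropicToolUse): OpenAIToolCall {
  return {
    id: block.id,
    type: "function",
    function: {
      name: block.name,
      // OpenAI carries arguments as a JSON string, not an object.
      arguments: JSON.stringify(block.input),
    },
  };
}
```

The reverse direction would parse `arguments` back into an object with `JSON.parse` before handing the result to the Anthropic-format agent loop.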
3 changes: 2 additions & 1 deletion src/config.ts
@@ -3,7 +3,7 @@ import path from "node:path";
 import os from "node:os";

 export interface LLMConfig {
-  provider: "anthropic" | "openai" | "openrouter";
+  provider: "anthropic" | "openai" | "openrouter" | "minimax";
   model: string;
   apiKey: string;
 }
@@ -118,6 +118,7 @@ export function initConfig(opts: {
   anthropic: "claude-sonnet-4-20250514",
   openai: "gpt-4o",
   openrouter: "anthropic/claude-sonnet-4-20250514",
+  minimax: "MiniMax-M2.7",
 };

 const config: CashClawConfig = {
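The map in the `src/config.ts` hunk keys a default model to each provider value, so setup can fall back when no model is supplied. A sketch of that lookup, with the helper name assumed rather than taken from `src/config.ts`:

```typescript
// Provider union matching the LLMConfig change in this PR.
type Provider = "anthropic" | "openai" | "openrouter" | "minimax";

// Mirrors the defaults shown in the diff.
const DEFAULT_MODELS: Record<Provider, string> = {
  anthropic: "claude-sonnet-4-20250514",
  openai: "gpt-4o",
  openrouter: "anthropic/claude-sonnet-4-20250514",
  minimax: "MiniMax-M2.7",
};

// Assumed helper: an explicit model wins, otherwise use the provider default.
function resolveModel(provider: Provider, model?: string): string {
  return model ?? DEFAULT_MODELS[provider];
}
```

Using a `Record<Provider, string>` means adding a provider to the union without a default entry fails at compile time, which is one reason this PR touches both spots in the same file.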
5 changes: 5 additions & 0 deletions src/llm/index.ts
@@ -236,6 +236,11 @@ export function createLLMProvider(config: LLMConfig): LLMProvider {
         config,
         "https://openrouter.ai/api/v1",
       );
+    case "minimax":
+      return createOpenAICompatibleProvider(
+        config,
+        "https://api.minimax.io/v1",
+      );
     default:
       throw new Error(`Unknown LLM provider: ${config.provider}`);
   }
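The `createOpenAICompatibleProvider` calls in the `src/llm/index.ts` hunk pass a config plus a base URL, which is all an OpenAI-compatible chat endpoint needs when built on raw `fetch()`. A minimal sketch under assumed names and a simplified signature (the real one in `src/llm/index.ts` may differ):

```typescript
// Minimal message shape for an OpenAI-style chat endpoint.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Assumed provider shape: one async complete() method, no SDK involved.
function createOpenAICompatibleProvider(
  apiKey: string,
  baseUrl: string,
  model: string,
) {
  return {
    async complete(messages: ChatMessage[]): Promise<string> {
      // Same request shape for OpenAI, OpenRouter, and MiniMax;
      // only baseUrl and model differ between providers.
      const res = await fetch(`${baseUrl}/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: `Bearer ${apiKey}`,
        },
        body: JSON.stringify({ model, messages }),
      });
      if (!res.ok) throw new Error(`LLM request failed: ${res.status}`);
      const data = await res.json();
      return data.choices[0].message.content;
    },
  };
}
```

This is why the PR only has to add a `case` with a new base URL: the MiniMax endpoint speaks the same `/chat/completions` dialect, so the existing adapter is reused unchanged.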
1 change: 1 addition & 0 deletions src/ui/pages/Settings.tsx
@@ -247,6 +247,7 @@ export function Settings() {
       <option value="anthropic">Anthropic</option>
       <option value="openai">OpenAI</option>
       <option value="openrouter">OpenRouter</option>
+      <option value="minimax">MiniMax</option>
     </select>
   </Field>
   <Field label="Model">
1 change: 1 addition & 0 deletions src/ui/pages/setup/LLMStep.tsx
@@ -9,6 +9,7 @@ const PROVIDERS = [
   { value: "anthropic", label: "ANTHROPIC", desc: "Claude models", model: "claude-sonnet-4-20250514" },
   { value: "openai", label: "OPENAI", desc: "GPT-4o", model: "gpt-4o" },
   { value: "openrouter", label: "OPENROUTER", desc: "Multi-provider", model: "openai/gpt-5.4" },
+  { value: "minimax", label: "MINIMAX", desc: "MiniMax M2.7", model: "MiniMax-M2.7" },
 ];

 export function LLMStep({ onNext }: LLMStepProps) {