35 changes: 20 additions & 15 deletions README.md
@@ -14,6 +14,7 @@ An Obsidian plugin that integrates with LLM CLI tools (Claude, Codex, OpenCode,
- **Create Notes from Responses** - Save LLM responses as new notes in your vault
- **Quick Prompts** - Commands for summarizing, explaining, and improving selected text
- **Session Continuation** - Follow-up messages use session resumption for faster responses
- **ACP Mode** - Optional persistent connection mode for faster multi-turn conversations

## Requirements

@@ -93,8 +94,13 @@ Copy `main.js`, `manifest.json`, and `styles.css` to your vault's `.obsidian/plu

Each provider can be configured with:
- Enable/disable
- Model selection (dropdown with common models, or enter custom model ID)
- Custom command (if CLI is named differently)
- Timeout override
- **ACP Mode** (experimental) - Use Agent Client Protocol for persistent connections
- **Thinking Mode** (ACP only) - Control extended thinking level (none/low/medium/high)

When ACP mode is enabled, the available models list is populated dynamically from the connected agent.
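
The exact RPC surface depends on the agent and the SDK version; as a rough, hypothetical sketch (the method name and response shape below are assumptions, not the confirmed `@agentclientprotocol/sdk` API), the settings dropdown could be filled from whatever the agent reports when a session is created:

```typescript
// Hypothetical sketch only: method and field names are illustrative, not the
// confirmed ACP SDK surface. The idea is that the agent, not the plugin,
// owns the list of selectable models.
interface AgentModel {
  modelId: string; // e.g. "sonnet"
  name: string;    // human-readable display name
}

async function fetchAgentModels(
  request: (method: string, params: unknown) => Promise<any>,
): Promise<AgentModel[]> {
  // Assumed: the session-creation response advertises the models the agent supports.
  const session = await request("session/new", { cwd: "/", mcpServers: [] });
  return (session.models ?? []) as AgentModel[];
}
```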

### System Prompt

@@ -153,28 +159,27 @@ The plugin captures session IDs from CLI tools and uses them for subsequent requ
- **Claude**: `--resume <session_id>`
- **OpenCode**: `--session <session_id>`
- **Gemini**: `--resume <session_id>`
- **Codex**: Uses `resume` subcommand (different pattern, not fully supported)
- **Codex**: `--resume <session_id>`

This improves response times for follow-up messages. Clearing the conversation resets the session.
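
A minimal sketch of how the plugin-side argument handling could look (the flag table above is the only part taken from this README; the helper itself is illustrative):

```typescript
// Illustrative only: append the provider-specific resume flag to whatever
// base arguments the plugin already builds for the CLI invocation.
type Provider = "claude" | "opencode" | "gemini" | "codex";

const RESUME_FLAGS: Record<Provider, string> = {
  claude: "--resume",
  opencode: "--session",
  gemini: "--resume",
  codex: "--resume",
};

function withSessionResumption(
  provider: Provider,
  baseArgs: string[],
  sessionId?: string,
): string[] {
  // First message: no session captured yet, run the CLI normally.
  if (!sessionId) return baseArgs;
  // Follow-up message: resume the captured session for a faster response.
  return [...baseArgs, RESUME_FLAGS[provider], sessionId];
}
```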

### Future Improvements
### ACP Mode (Agent Client Protocol)

**ACP (Agent Client Protocol)**: We're exploring ACP support for improved performance and capabilities. ACP is a standardized protocol (like LSP for AI agents) that allows:
- Long-lived agent processes (no startup overhead per request)
- Streaming responses via JSON-RPC
- Terminal integration (agents can run commands)
- Standardized session management
ACP is a standardized protocol (like LSP but for AI agents) that provides:
- **Persistent connections** - No startup overhead per request
- **Streaming responses** - Real-time updates via JSON-RPC
- **Dynamic model discovery** - Available models fetched from the agent
- **Extended thinking** - Support for thinking mode levels

CLI tools with ACP support:
- `opencode acp` - OpenCode's ACP server
- `gemini --experimental-acp` - Gemini CLI's ACP mode
- Claude Code - via `@zed-industries/claude-code-acp` adapter
All four providers support ACP mode:
- **Claude** - via `@anthropic-ai/claude-code-acp` adapter
- **OpenCode** - native `opencode acp` server
- **Gemini** - via `gemini --experimental-acp` flag
- **Codex** - via `@zed-industries/codex-acp` adapter

See also: [obsidian-agent-client](https://github.com/RAIT-09/obsidian-agent-client) for an alternative ACP-based approach.
Enable ACP in provider settings. The first message establishes a connection; subsequent messages reuse it for faster responses.
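
Conceptually, the connection lifecycle looks like the sketch below. The launch commands follow the adapter list above; the class itself is a simplified illustration (the real plugin speaks JSON-RPC to the agent's stdio via the ACP SDK, which is omitted here):

```typescript
import { spawn, type ChildProcess } from "child_process";

// Simplified illustration of connection reuse. How each agent is launched
// follows the list above; the exact invocation (e.g. npx vs. a global
// install) is an assumption.
const ACP_COMMANDS: Record<string, { cmd: string; args: string[] }> = {
  claude: { cmd: "npx", args: ["@anthropic-ai/claude-code-acp"] },
  opencode: { cmd: "opencode", args: ["acp"] },
  gemini: { cmd: "gemini", args: ["--experimental-acp"] },
  codex: { cmd: "npx", args: ["@zed-industries/codex-acp"] },
};

class AcpConnection {
  private proc: ChildProcess | null = null;

  // First message: spawn the agent once. Follow-ups reuse the live process,
  // which is what avoids the per-request startup cost.
  ensureStarted(provider: keyof typeof ACP_COMMANDS): ChildProcess {
    if (!this.proc) {
      const { cmd, args } = ACP_COMMANDS[provider];
      this.proc = spawn(cmd, args, { stdio: ["pipe", "pipe", "pipe"] });
      this.proc.on("exit", () => (this.proc = null));
    }
    return this.proc;
  }

  dispose(): void {
    this.proc?.kill();
    this.proc = null;
  }
}
```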

**Long-lived CLI Process**: As an alternative to ACP, some tools support headless server modes:
- `opencode serve` / `opencode attach` - Headless server mode
- `codex mcp-server` - MCP server mode
See also: [obsidian-agent-client](https://github.com/RAIT-09/obsidian-agent-client) for an alternative ACP-based approach.

## License

71 changes: 64 additions & 7 deletions main.ts
@@ -197,6 +197,29 @@ export default class LLMPlugin extends Plugin {
if (leaf) {
workspace.revealLeaf(leaf);
}

return leaf;
}

/**
* Get the ChatView instance if it exists
*/
getChatView(): ChatView | null {
const leaves = this.app.workspace.getLeavesOfType(CHAT_VIEW_TYPE);
if (leaves.length > 0) {
return leaves[0].view as ChatView;
}
return null;
}

/**
* Add a message exchange to the chat view
*/
addToChatView(userMessage: string, assistantMessage: string, provider: LLMProvider) {
const chatView = this.getChatView();
if (chatView) {
chatView.addMessageExchange(userMessage, assistantMessage, provider);
}
}

async loadSettings() {
@@ -258,8 +281,10 @@ export default class LLMPlugin extends Plugin {
/**
* Update the status bar with current provider and model info
* @param provider Optional provider to display (uses default if not specified)
* @param actualModelName Optional actual model name from ACP session (overrides configured model display)
* @param status Optional status: "idle" (default), "connecting", "connected"
*/
updateStatusBar(provider?: LLMProvider) {
updateStatusBar(provider?: LLMProvider, actualModelName?: string, status?: "idle" | "connecting" | "connected") {
if (!this.statusBarEl) return;

const displayProvider = provider ?? this.settings.defaultProvider;
@@ -278,8 +303,13 @@

// Build status text with provider and model
let statusText = providerNames[displayProvider] || displayProvider;
if (providerConfig?.model) {
// Show abbreviated model name
if (status === "connecting") {
statusText += " (connecting...)";
} else if (actualModelName) {
// Use actual model name from ACP session
statusText += ` (${this.formatModelName(actualModelName)})`;
} else if (providerConfig?.model) {
// Show configured model name
statusText += ` (${this.formatModelName(providerConfig.model)})`;
} else {
// Indicate CLI default is being used
@@ -291,8 +321,10 @@
cls: "llm-status-text",
});

// Check if provider is enabled
if (providerConfig?.enabled) {
// Set indicator state based on status
if (status === "connecting") {
indicator.addClass("connecting");
} else if (providerConfig?.enabled) {
indicator.addClass("active");
}
}
@@ -301,7 +333,7 @@
* Format model name for display (abbreviate long names)
*/
private formatModelName(model: string): string {
// Common abbreviations
// Common abbreviations for model IDs
const abbreviations: Record<string, string> = {
"claude-3-5-haiku-latest": "haiku",
"claude-3-5-sonnet-latest": "sonnet-3.5",
@@ -319,8 +351,33 @@
"gpt-5": "5",
"claude-sonnet": "sonnet",
"claude-haiku": "haiku",
// ACP display names (from Claude ACP adapter)
"default": "opus",
"Default (recommended)": "opus",
"Sonnet": "sonnet",
"Haiku": "haiku",
};

return abbreviations[model] || model;
// Check for exact match first
if (abbreviations[model]) {
return abbreviations[model];
}

// Try case-insensitive match
const lowerModel = model.toLowerCase();
for (const [key, value] of Object.entries(abbreviations)) {
if (key.toLowerCase() === lowerModel) {
return value;
}
}

// If model name is long, try to extract a shorter name
// Remove text in parentheses and trim
const simplified = model.replace(/\s*\([^)]*\)\s*/g, "").trim();
if (simplified !== model && simplified.length > 0) {
return this.formatModelName(simplified);
}

return model;
}
}
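
For orientation, the new helpers are meant to be called from command callbacks roughly like this (a hypothetical usage sketch, not code from this PR; `runProvider` stands in for whatever method actually dispatches the request):

```typescript
// Hypothetical usage of the helpers added above, from inside LLMPlugin.onload().
// `runProvider` is a placeholder for the plugin's real dispatch method.
this.addCommand({
  id: "ask-llm-about-selection",
  name: "Ask LLM about selection",
  editorCallback: async (editor) => {
    const prompt = editor.getSelection();
    const provider = this.settings.defaultProvider;

    this.updateStatusBar(provider, undefined, "connecting");
    const answer = await this.runProvider(provider, prompt); // assumed helper
    this.updateStatusBar(provider, undefined, "connected");

    this.addToChatView(prompt, answer, provider);
  },
});
```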
43 changes: 35 additions & 8 deletions package-lock.json

Some generated files are not rendered by default.

3 changes: 3 additions & 0 deletions package.json
@@ -42,5 +42,8 @@
"typescript": "^5.0.0",
"wdio-obsidian-reporter": "^2.2.1",
"wdio-obsidian-service": "^2.2.1"
},
"dependencies": {
"@agentclientprotocol/sdk": "^0.13.1"
}
}