Pluggable LLM adapters (Groq/OpenAI/local) #12

@StressTestor

Description

Define an adapter interface and support Groq, OpenAI, and local (Ollama) providers, selected via config/env. Acceptance criteria: providers can be switched via env/config with no code changes, and the CLI and wake modes present the same UX regardless of provider.
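
A minimal sketch of what the adapter layer could look like, assuming Python; the names here (`LLMAdapter`, `get_adapter`, the `LLM_PROVIDER` / `*_API_KEY` / `*_MODEL` env vars, and the default model strings) are illustrative placeholders, not something this issue defines. The idea is one abstract interface, one class per provider, and a factory that reads the provider from the environment so CLI and wake modes call the exact same code path:

```python
import os
from abc import ABC, abstractmethod


class LLMAdapter(ABC):
    """Common interface every provider adapter implements."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for a single prompt."""


class GroqAdapter(LLMAdapter):
    def __init__(self) -> None:
        # Env var names are placeholders for this sketch.
        self.api_key = os.environ.get("GROQ_API_KEY", "")
        self.model = os.environ.get("GROQ_MODEL", "llama-3.1-8b-instant")

    def complete(self, prompt: str) -> str:
        # The real implementation would call the Groq API here.
        raise NotImplementedError


class OpenAIAdapter(LLMAdapter):
    def __init__(self) -> None:
        self.api_key = os.environ.get("OPENAI_API_KEY", "")
        self.model = os.environ.get("OPENAI_MODEL", "gpt-4o-mini")

    def complete(self, prompt: str) -> str:
        # The real implementation would call the OpenAI API here.
        raise NotImplementedError


class OllamaAdapter(LLMAdapter):
    def __init__(self) -> None:
        self.base_url = os.environ.get("OLLAMA_URL", "http://localhost:11434")
        self.model = os.environ.get("OLLAMA_MODEL", "llama3")

    def complete(self, prompt: str) -> str:
        # The real implementation would POST to the local Ollama server here.
        raise NotImplementedError


_PROVIDERS = {
    "groq": GroqAdapter,
    "openai": OpenAIAdapter,
    "ollama": OllamaAdapter,
}


def get_adapter() -> LLMAdapter:
    """Select the adapter from LLM_PROVIDER; both CLI and wake modes use this."""
    provider = os.environ.get("LLM_PROVIDER", "groq").lower()
    try:
        return _PROVIDERS[provider]()
    except KeyError:
        raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}") from None
```

With something along these lines, switching providers is just `LLM_PROVIDER=ollama` (or a config-file equivalent), and callers only ever see `get_adapter().complete(prompt)`.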

Metadata

Assignees

No one assigned

    Labels

    2.x (Next series), architecture (Architecture/design), feature (New feature), llm (LLM integration)

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests