clai integrates AI models from multiple vendors via the terminal.
You can generate images and text, summarize content, and chat, all while using native terminal functionality such as pipes and termination signals.
The multi-vendor approach makes it easy to compare different models and removes the need for multiple subscriptions: most APIs are usage-based (some with an expiration time).
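For example, the terminal integration means piped data can feed a prompt directly. The snippet below is a sketch only; the `q`/`query` subcommand name is an assumption here, so check `clai help` for the exact syntax.

```sh
# Minimal sketch; the `q` (query) subcommand name is assumed
git diff | clai q "write a concise commit message"
# Termination signals (e.g. Ctrl+C) behave as they do for any other terminal program
```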
- Prompting with input from (see the sketch after this list):
  - Piped data
  - Globbed file input
  - Args
- Conversations (with the same input options as above)
- Tool calling with easily forkable + extendable tools
- LLM Profiles - preconfigured prompts with specific tools
- Photo generation*
- Human-readable / robot-readable output
- 100% Go standard library (except for /x/net)
* Only with DALL-E for the moment. Nag me to implement modellabs and I'll do it.
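A few illustrative invocations of the features above. This is a sketch, not the definitive interface: the subcommand names (`q`, `chat new`, `photo`) are assumptions, so treat `clai help` as the source of truth.

```sh
# Subcommand names assumed; run `clai help` for the real interface
clai q "What does the -race flag do in Go?"         # prompt from args
cat notes.md | clai q "summarize these notes"       # prompt from piped data
clai chat new "help me plan a refactor"             # start a conversation
clai photo "a watercolor fox reading a newspaper"   # photo generation (DALL-E, needs OPENAI_API_KEY)
```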
- Go: Install Go from https://go.dev/doc/install.
- OpenAI API Key: Set the `OPENAI_API_KEY` env var to your OpenAI API key. Text models, photo models.
- Anthropic API Key: Set the `ANTHROPIC_API_KEY` env var to your Anthropic API key. Text models.
- Mistral API Key: Set the `MISTRAL_API_KEY` env var to your Mistral API key. Text models.
- Novita AI: Set the `NOVITA_API_KEY` env var to your Novita API key. Target the model using the `novita` prefix, like this: `novita:<target>`, where `<target>` is one of the text models.
- Ollama: Start your Ollama server (defaults to localhost:11434). Target it using the model format `ollama:<target>`, where `<target>` is optional (defaults to llama3). Reconfigure the URL with `clai s -> 1 -> <ollama-model-conf>`.
- Glow (Optional): Install Glow for formatted markdown output when querying text responses.
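As an example, a minimal environment setup for a couple of vendors could look like this (the key values are placeholders; only export keys for the vendors you actually use):

```sh
# Placeholder values; replace with your real keys
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."

# Ollama needs no key, only a running server (defaults to localhost:11434)
ollama serve &
```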
Note that you can only use models from vendors whose API key you have configured.
Most text- and photo-based models from the respective vendors are supported; see the model configurations for how to swap between them.
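Swapping is done through the interactive setup mentioned above; this is a sketch, as the exact menu layout may differ between versions:

```sh
# Open the interactive configuration and pick a different text or photo model
clai s
```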
`go install github.com/baalimago/clai@latest`
You may also use the setup script:
`curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh`
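After installing, a quick way to confirm the binary is reachable (this assumes `$GOPATH/bin`, typically `$HOME/go/bin`, is on your `PATH`):

```sh
# Prints clai's usage text if the installation succeeded
clai help
```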
Either look at `clai help` or the examples for how to use `clai`.
This project was originally inspired by https://github.com/Licheam/zsh-ask; many thanks to Licheam for the inspiration.