clai: command line artificial intelligence

Command line artificial intelligence - Because copy-pasting to a browser is glacial.

clai integrates AI models from multiple vendors via the CLI. You can generate images and text, summarize content, and chat, all while using native terminal functionality such as pipes and termination signals.

It's not (only) an LLM-powered command suggester; rather, it's a CLI-native LLM context feeder designed to fit into each user's own workflow.

The multi-vendor aspect enables easy comparisons between different models and removes the need for multiple subscriptions: most APIs are usage-based (some with expiration times).

Features

  • Piping into LLM: piping
  • Easily configurable profiles (note the built-in tools!): profiles
  • Conversation history and a simple TUI to browse and continue old chats: chats

These core features can be combined: for instance, you can pipe data into an existing chat, continue a chat with another profile, or switch to another chat model.
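
As a rough sketch of how such combinations might look in a shell (the subcommand and the reply flag below are assumptions, so verify the exact names against clai help):

# Pipe a file into a one-off query (hypothetical file and prompt):
cat build.log | clai query "Why did this build fail?"

# Follow up within the same conversation (reply flag name assumed):
clai -re query "Suggest a fix for that error"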

All configuration files and chats are stored as JSON, so manual tweaks and manipulation are easy.

If you have time, check out this blog post for a slightly more structured introduction to using clai efficiently.

Supported vendors

  • OpenAI: Set the OPENAI_API_KEY env var to your OpenAI API key. Text models, photo models.
  • Anthropic: Set the ANTHROPIC_API_KEY env var to your Anthropic API key. Text models.
  • Mistral: Set the MISTRAL_API_KEY env var to your Mistral API key. Text models.
  • DeepSeek: Set the DEEPSEEK_API_KEY env var to your DeepSeek API key. Text models.
  • Novita AI: Set the NOVITA_API_KEY env var to your Novita API key. Target a model using the novita prefix, like this: novita:<target>, where <target> is one of the text models.
  • Ollama: Start your Ollama server (defaults to localhost:11434). Target a model using the format ollama:<target>, where <target> is optional (defaults to llama3). Reconfigure the URL with clai setup -> 1 -> <ollama-model-conf>.

Note that you can only use the models that you have bought an API key for.
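
For example, targeting a specific vendor might look like this minimal sketch (the -chat-model flag name is an assumption, so check clai help for the exact flag; the ollama:<target> prefix follows the format above):

# Use Anthropic text models by exporting the key (hypothetical value):
export ANTHROPIC_API_KEY="sk-ant-..."
clai query "Summarize the plot of Hamlet"

# Or target a local Ollama model via the ollama:<target> prefix (flag name assumed):
clai -chat-model ollama:llama3 query "Summarize the plot of Hamlet"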

Get started

go install github.com/baalimago/clai@latest

You may also use the setup script:

curl -fsSL https://raw.githubusercontent.com/baalimago/clai/main/setup.sh | sh

Look at clai help or the examples to see how to use clai.

Install Glow for formatted markdown output when querying text responses.
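
Once installed, a first run might look like the following sketch (the query subcommand is an assumption here, so verify with clai help):

# Ask a one-off question; the output renders as formatted markdown if Glow is installed:
clai query "Give me a one-liner to find the five largest files in this directory"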
