AiSH (Artificially Intelligent Shell) is an autonomous, AI-driven shell assistant that interprets user input, generates shell commands, and executes them on your real machine. It supports both online (cloud LLMs) and offline (Ollama) modes, with robust error handling, command history, and dynamic configuration.
Beta Version 0.2
- AI-Powered Command Generation: Converts natural language into shell commands using LLMs (Groq, Gemini, OpenRouter, Ollama).
- Autonomous Task Processing: Breaks down complex tasks into step-by-step shell commands and executes them.
- Error Correction: Uses AI to analyze and retry failed commands.
- Interactive Shell: Enhanced prompt with auto-completion, history, and customizable themes.
- Secure API Key Storage: API keys are encrypted using Fernet.
- Cross-Platform: Works on Linux, macOS, and Windows.
Install dependencies with `pip install -r requirements.txt`.

- On first run, `.aishrc` is created in your home directory.
- The recommended way to add your API keys and preferred models is with the commands `/config api edit <api> key <value>` and `/config api edit <api> model <value>`.
- You can also add them directly in `.aishrc`, but the commands above are preferred. Either way, API keys are encrypted automatically.
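For example, to configure Groq as the provider (the key below is a placeholder and the model name is only an illustration; use whichever provider and model identifiers your setup supports):

```
/config api edit groq key <your-groq-api-key>       # placeholder key
/config api edit groq model llama-3.1-8b-instant    # example model name
/config api current groq                            # make Groq the active provider
```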
Start AiSH with `python src/aish.py`.

- Enter natural language requests (e.g., `list files in home directory`).
- Use `/help` for a list of commands.
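A first session might look something like this (the `aish>` prompt and these requests are only illustrative; the actual prompt depends on your theme):

```
$ python src/aish.py
aish> list files in home directory
aish> is docker running?
aish> /exit
```

The built-in commands are summarized in the table below.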
| Command | Description |
|---|---|
| `/help` or `/h` | Show help |
| `/verbose` or `/v` | Toggle verbose mode |
| `/config` or `/c` | Configure APIs, history, etc. |
| `/prompt [theme]` | Change prompt theme (`default`, `pwd`, `mood`) |
| `/exit` or `/e` | Exit AiSH |
| `!<cmd>` | Execute a raw shell command |
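Prefixing input with `!` bypasses the AI and runs the command as-is; for example (the commands themselves are just illustrations):

```
!df -h
!docker ps
```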
- Use `/config api current <api>` to set the active AI provider.
- Use `/config api edit <api> key <key>` to update API keys.
- Use `/config api edit <api> model <model>` to update models.
- Use `/config prev_cmds <n>` to set the command history length included in prompts.
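For instance, to switch to local Ollama and include the last five commands in each prompt (the provider name and count here are illustrative):

```
/config api current ollama
/config prev_cmds 5
```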
- API keys are securely stored and never saved in plaintext.
- Your sensitive configuration remains protected at all times.
Example requests you can try:

- `list files and folders`
- `write hello world python program`
- `how much disk space is left?`
- `is docker running?`
- If you see errors about missing API keys, edit `.aishrc` and add your keys.
- For offline mode, ensure Ollama is running and the model is set in `.aishrc`.
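To check the offline setup before launching AiSH, you can use the standard Ollama CLI from a regular shell (which model to pull depends on what you set in `.aishrc`):

```
ollama serve          # start the Ollama server if it is not already running
ollama list           # list models available locally
ollama pull <model>   # fetch the model referenced in .aishrc if it is missing
```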
For more details, see the code in the src/ directory.
