A powerful desktop AI assistant that combines the Claude API with local Ollama models for task automation, autonomous exploration, and intelligent enhancement.
- Code Generation: Write, debug, and optimize code in multiple programming languages
- Script Creation: Generate automation scripts for system tasks and workflows
- File Processing: Batch process files with intelligent operations
- Image Analysis: Describe, analyze, and extract information from images
- Audio Processing: Transcribe and analyze audio files (planned)
- Video Enhancement: Process and enhance video content (planned)
- CSV Analysis: Analyze data files and generate insights
- Visualization: Create charts and graphs from data
- Statistical Analysis: Perform advanced data analysis
- RAG Processing: Summarize and analyze documents with retrieval-augmented generation
- Q&A Systems: Answer questions based on document content
- Content Extraction: Extract key information from various file formats
- Context Awareness: Remember previous tasks and conversations
- Learning: Improve responses based on interaction history
- Personalization: Adapt to user preferences and patterns
- macOS 10.15+ (optimized for macOS, but works on other platforms)
- Python 3.9 or higher
- 8GB+ RAM recommended
- Internet connection for Claude API (optional)
```bash
# Clone the repository
git clone https://github.com/rohnspringfield/supermini.git
cd supermini

# Run automated setup
chmod +x dependencies/install.sh
./dependencies/install.sh
```

```bash
# Install Python dependencies
pip3 install PyQt6 anthropic requests pandas numpy psutil chromadb

# Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# Start Ollama service
ollama serve &

# Pull recommended models
ollama pull qwen2.5-coder:7b
ollama pull llama3.2:3b

# Create directories
mkdir -p ~/SuperMini_Output/data ~/SuperMini_Output/logs

# Run the application
python3 supermini.py
```

- Launch SuperMini: Run `python3 supermini.py`
- Configure API Key (optional but recommended):
  - Click "⚙️ Settings"
  - Enter your Claude API key from the Anthropic Console
  - Save settings
- Verify Ollama: Ensure local models are downloaded and running
- Start Creating: Enter your first task and click "🚀 Process Task"
- Task: "Create a Python script to sort a CSV file by date column"
- Files: `data.csv`
- Result: Complete Python script with error handling
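A generated script for this task might look roughly like the following minimal sketch (the column name `date` and ISO-formatted dates are assumptions for illustration):

```python
import csv
from datetime import datetime

def sort_csv_by_date(in_path, out_path, column="date", fmt="%Y-%m-%d"):
    """Sort the rows of a CSV file by a date column, oldest first."""
    with open(in_path, newline="") as f:
        reader = csv.DictReader(f)
        fieldnames = reader.fieldnames
        rows = list(reader)
    # Parse the date column; fail with a clear message on malformed values
    try:
        rows.sort(key=lambda r: datetime.strptime(r[column], fmt))
    except (KeyError, ValueError) as exc:
        raise SystemExit(f"Could not sort by {column!r}: {exc}")
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```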
- Task: "Analyze this screenshot and describe the UI elements"
- Files: `screenshot.png`
- Result: Detailed description of interface components
- Task: "Analyze sales data and create visualizations"
- Files: `sales_data.csv`
- Result: Statistical analysis + Python visualization code
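The statistical half of such a result could resemble this standard-library sketch (the numeric column name is an assumption; a real answer would likely use pandas and add plotting code):

```python
import csv
import statistics

def summarize_column(path, column):
    """Compute basic descriptive statistics for a numeric CSV column."""
    with open(path, newline="") as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    return {
        "count": len(values),
        "mean": statistics.mean(values),
        "median": statistics.median(values),
        # Sample standard deviation needs at least two values
        "stdev": statistics.stdev(values) if len(values) > 1 else 0.0,
    }
```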
- Task: "Summarize the key points from these research papers"
- Files: `paper1.pdf`, `paper2.pdf`
- Result: Comprehensive summary with key insights
- Task: "Create a backup script for my Documents folder"
- Result: Bash script with scheduling options
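A generated backup routine could take roughly this shape (shown here in Python rather than Bash; the archive naming is an assumption, and scheduling would be layered on with cron or launchd):

```python
import zipfile
from datetime import datetime
from pathlib import Path

def backup_folder(src_dir, dest_dir):
    """Zip the contents of src_dir into a timestamped archive in dest_dir."""
    src, dest = Path(src_dir), Path(dest_dir)
    dest.mkdir(parents=True, exist_ok=True)
    archive = dest / f"backup-{datetime.now():%Y%m%d-%H%M%S}.zip"
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        # Walk the tree and store paths relative to the source folder
        for path in src.rglob("*"):
            if path.is_file():
                zf.write(path, path.relative_to(src))
    return archive
```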
- Claude API: Primary model for complex reasoning (requires API key)
- Ollama Local: Fallback models that run entirely on your machine
  - `qwen2.5-coder:7b`: best for coding tasks
  - `llama3.2:3b`: general purpose, faster responses
- Max Tokens: Control response length (512-8192)
- Temperature: Creativity level (0.0-1.0)
- Memory: Enable/disable context awareness
- Task Types: Auto-detection or manual selection
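These settings map onto the options of Ollama's local HTTP API. A minimal sketch of such a call (the `/api/generate` endpoint and option names follow Ollama's documented interface; error handling and streaming are elided):

```python
import json
import urllib.request

def build_request(prompt, model="llama3.2:3b", max_tokens=2048, temperature=0.7):
    """Assemble an Ollama /api/generate payload from SuperMini-style settings."""
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        # Ollama calls the response-token limit "num_predict"
        "options": {"num_predict": max_tokens, "temperature": temperature},
    }

def generate(prompt, **settings):
    """Send a prompt to a locally running Ollama server and return the text."""
    body = json.dumps(build_request(prompt, **settings)).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```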
```
~/SuperMini_Output/
├── data/
│   ├── memory/          # ChromaDB memory database
│   ├── collaboration/   # Task sharing data
│   └── generated_*      # Your generated files
└── logs/
    └── supermini.log    # Application logs
```
- Description: Programming, scripting, debugging
- Input: Code requirements, existing files
- Output: Complete scripts, functions, or applications
- Examples: "Fix this Python script", "Create a web scraper"
- Description: Image, audio, and video processing
- Input: Media files, processing requirements
- Output: Analysis, enhanced files, descriptions
- Examples: "Describe this image", "Enhance video quality"
- Description: Document analysis and question answering
- Input: Documents, questions
- Output: Summaries, answers, insights
- Examples: "Summarize these PDFs", "Answer questions about this document"
- Description: System tasks and workflow automation
- Input: Task requirements, file operations
- Output: Scripts, shortcuts, automation workflows
- Examples: "Backup my files", "Organize downloads folder"
- Description: Data analysis and visualization
- Input: Data files (CSV, JSON), analysis requirements
- Output: Statistical analysis, charts, insights
- Examples: "Analyze sales trends", "Create data dashboard"
SuperMini remembers your previous tasks and uses them for context:
- Automatic Learning: Improves responses based on your preferences
- Context Awareness: References previous tasks when relevant
- Pattern Recognition: Learns your common workflows
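Conceptually, the context lookup is a similarity search over past tasks. This simplified sketch uses word overlap as a stand-in for the embedding-based search that ChromaDB actually provides:

```python
def retrieve_context(query, history, top_k=2):
    """Return the top_k past tasks most similar to the query (word overlap)."""
    q = set(query.lower().split())
    # Score each past task by how many words it shares with the query
    scored = [(len(q & set(task.lower().split())), task) for task in history]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [task for score, task in scored[:top_k] if score > 0]
```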
SuperMini can improve itself autonomously and contribute back to the open-source project:
- 🔍 Self-Analysis: Automatically analyzes code quality and performance
- ✨ Code Enhancement: Suggests and implements improvements with safety validation
- 🔄 GitHub Integration: Automatically creates pull requests back to the repository
- 🛡️ Safety Framework: Multi-layer validation and rollback capabilities
- 🤝 Community Contributions: Autonomous agents help with open-source development
- 📊 Impact Tracking: Measures and reports enhancement effectiveness
- 🎯 Smart Targeting: Only modifies safe files with appropriate permissions
Track resource usage in real time:
- CPU Usage: Monitor processing load
- Memory Usage: Track RAM consumption
- Performance Metrics: Optimize for your system
- Activity Logging: Comprehensive logging of all operations
SuperMini v2.1.0 introduces groundbreaking autonomous contribution capabilities:
- 🤖 Automatic Pull Requests: Creates PRs automatically when improvements are found
- 🛡️ Safety Controls: Strict validation ensures only safe, beneficial changes
- 📋 Professional Templates: Uses proper PR templates with detailed change descriptions
- 🎯 Smart File Filtering: Only modifies approved file types (`*.py`, `*.md`, etc.)
- 🔍 Human Oversight: All autonomous PRs require human review before merging
- 📊 Impact Measurement: Tracks and reports enhancement effectiveness
```bash
# 1. Set your GitHub token (optional, for autonomous PRs)
export GITHUB_TOKEN="your-github-token"

# 2. Enable autonomous mode in SuperMini
#    - Check "Autonomous Mode" in the interface
#    - Configure GitHub integration in Settings
#    - Review autonomous suggestions before approval

# 3. Monitor autonomous contributions
#    - Check GitHub for automatically created PRs
#    - Review and merge beneficial changes
#    - Provide feedback to improve AI suggestions
```

- Local Storage: All data stored securely on your machine
- Task History: Complete record of all interactions
- Export Options: Share results and generated files
- Open Source Workflow: Seamless integration with Git and GitHub
- Autonomous Contributions: AI can contribute improvements back to the project
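Creating such a pull request programmatically comes down to one authenticated call to GitHub's REST API. The sketch below illustrates the idea, not SuperMini's actual implementation; the repository and branch names are placeholders, and the payload builder is separated out so a change can be inspected before it is sent:

```python
import json
import os
import urllib.request

API = "https://api.github.com"

def build_pr_payload(title, head, base, body):
    """Assemble the JSON body for GitHub's 'create a pull request' endpoint."""
    return {"title": title, "head": head, "base": base, "body": body}

def open_pull_request(repo, payload):
    """POST the payload to /repos/{repo}/pulls, authenticated via GITHUB_TOKEN."""
    req = urllib.request.Request(
        f"{API}/repos/{repo}/pulls",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["html_url"]
```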
```bash
# Check Ollama status
ollama list

# Restart Ollama service
pkill ollama
ollama serve &

# Re-download models
ollama pull qwen2.5-coder:7b
```

- Verify API key in Settings
- Check internet connection
- Ensure sufficient API credits
```bash
# Clear memory database
rm -rf ~/SuperMini_Output/data/memory/*
# Then restart the application
```

```bash
# Update pip and dependencies
pip3 install --upgrade pip
pip3 install --upgrade -r requirements.txt
```

For faster responses:
- Use smaller Ollama models: `llama3.2:1b`
- Reduce max tokens in settings
- Disable memory for simple tasks

For better quality:
- Use Claude API for complex tasks
- Enable memory for context
- Use larger models: `qwen2.5-coder:14b`
- Local First: All processing can be done locally with Ollama
- No Data Collection: Your tasks and files stay on your machine
- Optional Cloud: Claude API only used when configured
- Secure Storage: All data encrypted and stored locally
We welcome contributions! Here's how to get started:
- Fork the Repository
- Create a Feature Branch: `git checkout -b feature/amazing-feature`
- Make Changes: Implement your improvements
- Test Thoroughly: Ensure everything works
- Submit Pull Request: Describe your changes
```bash
# Clone repository
git clone https://github.com/rohnspringfield/supermini.git
cd supermini

# Install development dependencies
pip3 install -r requirements.txt
pip3 install -r requirements-test.txt

# Run in development mode
python3 supermini.py
```

- Voice input/output support
- Plugin system for custom tools
- Web interface option
- Enhanced video processing
- Multi-language support
- Advanced automation workflows
- Team collaboration features
- Cloud synchronization option
This project is licensed under the MIT License - see the LICENSE file for details.
- Anthropic for the Claude API
- Ollama for local AI model hosting
- Qt/PyQt for the user interface framework
- ChromaDB for vector memory storage
- Documentation: Wiki
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- Project: Open-source AI desktop assistant
Made with ❤️ for the AI community
SuperMini - Where AI meets productivity