If you discover a security vulnerability in Amy, please report it responsibly:
- Do not open a public issue
- Email the maintainers directly or use GitHub's private vulnerability reporting feature
- Include a detailed description of the vulnerability
- Provide steps to reproduce if possible
We will respond within 48 hours and work with you to understand and address the issue.
| Version | Supported |
|---|---|
| Latest | Yes |
Amy uses Ollama with DeepSeek-R1 running locally:
- No external API keys required for default operation
- Data stays local — queries processed by local Ollama instance
- No external network calls for LLM inference (see the sketch below)
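A minimal sketch of what local-only inference looks like, assuming HTTP.jl and JSON3.jl; the endpoint and payload follow Ollama's default local API, and `local_generate` is an illustrative helper rather than part of Amy's actual interface:

```julia
using HTTP, JSON3

# Illustrative helper: all traffic goes to Ollama's default port on localhost,
# so no API keys are read and no external hosts are contacted.
function local_generate(prompt::AbstractString; model::AbstractString = "deepseek-r1")
    body = JSON3.write((; model, prompt, stream = false))
    resp = HTTP.post("http://localhost:11434/api/generate",
                     ["Content-Type" => "application/json"], body)
    return JSON3.read(resp.body).response
end
```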
All user inputs are validated before processing:
- Query length limits
- Type checking on inputs
- Unit validation via Unitful.jl
- Bounds checking on numerical operations (illustrated in the sketch below)
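The sketch below illustrates the kinds of checks listed above; the limits and helper names are assumptions for illustration, not Amy's actual validation code:

```julia
using Unitful

const MAX_QUERY_LENGTH = 2_000  # assumed cap, shown only for illustration

# Query length limit, with type checking handled by the ::AbstractString annotation.
function validate_query(query::AbstractString)
    length(query) <= MAX_QUERY_LENGTH ||
        throw(ArgumentError("query exceeds $MAX_QUERY_LENGTH characters"))
    return query
end

# Unit validation with Unitful.jl: reject quantities with the wrong dimension.
function validate_length_quantity(x::Unitful.Quantity)
    dimension(x) == dimension(u"m") ||
        throw(ArgumentError("expected a length, got $(dimension(x))"))
    return x
end

# Bounds checking before a numerical operation.
function checked_sqrt(x::Real)
    x >= 0 || throw(DomainError(x, "sqrt of a negative value"))
    return sqrt(x)
end
```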
The engine does not execute arbitrary code. All computations are limited to predefined tool functions.
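For illustration, this can be as simple as a fixed lookup table of tools; the tool names below are placeholders, not Amy's actual tool set:

```julia
using Unitful

# Only functions registered in this table can be invoked by the agent;
# model output selects a key, it never supplies code to be evaluated.
const TOOLS = Dict{String,Function}(
    "unit_convert" => (x, target) -> uconvert(target, x),  # e.g. km to m
    "square_root"  => x -> x >= zero(x) ? sqrt(x) : throw(DomainError(x, "negative input")),
)

function call_tool(name::AbstractString, args...)
    haskey(TOOLS, name) || throw(ArgumentError("unknown tool: $name"))
    return TOOLS[name](args...)
end

# Example: call_tool("unit_convert", 1.5u"km", u"m") returns 1500.0 m
```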
- Run in an isolated Julia environment with `--project=.`
- Set resource limits via `AgentConfig` (max tool calls, timeouts); see the sketch after this list
- Keep Julia and dependencies updated
- Monitor for unusual query patterns
- Validate external data sources (ephemeris, reference values)
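A deployment sketch tying the first two recommendations together; the `AgentConfig` keyword names below are assumptions, so check the actual constructor before copying them:

```julia
# Launch from the project root so only Amy's pinned dependencies are loaded:
#   julia --project=. --startup-file=no

using Amy  # assumes the package/module is named Amy

# Keyword names are illustrative; substitute whatever AgentConfig actually accepts.
config = AgentConfig(
    max_tool_calls = 8,     # cap tool invocations per query
    tool_timeout   = 30.0,  # seconds before a tool call is aborted
)
```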
- Queries are not logged by default
- Session data is in-memory only
- No telemetry collected
- Full offline operation supported