A full-stack web application for managing, classifying, and collaborating on visual content. Built for teams that need to organize image datasets, apply custom labels, and integrate machine learning analysis results.
Windows Users: PowerShell equivalents are available for development scripts:
- `ps_launch_postgres_minio.ps1` - Launch containers
- `backend\ps_run.ps1` - Run backend server
- `frontend\ps_run.ps1` - Run frontend dev server
- Project Organization - Group images into projects with team-based access control
- Image Classification - Apply custom labels and categories to organize images
- Team Collaboration - Add comments and share insights with your team
- Metadata Management - Store custom key-value metadata for projects and images
- ML Analysis Integration - Visualize machine learning results with interactive overlays (bounding boxes, heatmaps)
- Safe Deletion - Two-stage deletion with 60-day recovery period
- API Access - RESTful API with comprehensive documentation and API key authentication
- Node.js 22+
- Python 3.11+
- Podman (for PostgreSQL and MinIO)
- uv - Python package manager (`pip install uv`)
This assumes you're using the dev container, which has all required tools (Node.js, Python, Podman, etc.) pre-installed.
```bash
# Copy example environment file
cp .env.example .env
```

Edit `.env` and set these for development:

```bash
DEBUG=true
SKIP_HEADER_CHECK=true
```

```bash
# Start PostgreSQL and MinIO containers
bash short-cut-launch-postgres-minio.sh
```

```bash
# Create virtual environment
uv venv .venv

# Activate the virtual environment
source .venv/bin/activate

# Install dependencies
uv pip install -r requirements.txt
```

Terminal 1 - Frontend:

```bash
cd frontend
npm install   # First time only
npm run dev
```

Terminal 2 - Backend:

```bash
cd backend

# Run database migrations (first time or after model changes)
alembic upgrade head

# Start backend server
bash run.sh
```

Open http://localhost:3000 in your browser to see the app.
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs (Swagger UI)
- Alternative API Docs: http://localhost:8000/redoc (ReDoc)
All configuration is managed through the `.env` file in the repository root. This file is required for the application to run properly.
Setup:
```bash
# Copy the example file (contains all default settings)
cp .env.example .env

# Source it before running backend commands
set -a && source .env && set +a
```

Key Configuration Sections:
Database (PostgreSQL):
```bash
DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5433/postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=postgres
POSTGRES_DB=postgres
POSTGRES_SERVER=localhost
POSTGRES_PORT=5433
```

Important: `DATABASE_URL` must be set or Alembic migrations will fail. The default uses PostgreSQL on port 5433.
S3/MinIO Storage:
```bash
S3_ENDPOINT=localhost:9000
S3_ACCESS_KEY=minioadmin
S3_SECRET_KEY=minioadminpassword
S3_BUCKET=data-storage
S3_USE_SSL=false
```

Authentication:
```bash
# Development (uses mock user)
SKIP_HEADER_CHECK=false
CHECK_MOCK_MEMBERSHIP=true
MOCK_USER_EMAIL=test@example.com
MOCK_USER_GROUPS_JSON='["admin-group", "data-scientists", "project-alpha-group"]'

# Production (reverse proxy authentication)
PROXY_SHARED_SECRET=your-secure-secret-here
AUTH_SERVER_URL=https://your-auth-server.com
X_USER_ID_HEADER=X-User-Email
X_PROXY_SECRET_HEADER=X-Proxy-Secret
```

ML Analysis (Optional):
```bash
ML_ANALYSIS_ENABLED=true
ML_CALLBACK_HMAC_SECRET=your-hmac-secret
ML_ALLOWED_MODELS=yolo_v8,resnet50_classifier
```

Application Settings:
```bash
APP_NAME="Data Management API"
DEBUG=false
CACHE_SIZE_MB=1000
FRONTEND_BUILD_PATH=frontend/build
```

Important: Migrations are NOT run automatically. This is intentional, to prevent accidental schema changes in production.
```bash
# Activate virtual environment and load environment variables
source .venv/bin/activate
set -a && source .env && set +a

# Run migrations
cd backend
alembic upgrade head
```

Common Issue: If you get "table already exists" errors, check that `DATABASE_URL` in `.env` points to PostgreSQL, not SQLite.
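The `set -a && source .env && set +a` pattern exports every `KEY=VALUE` line into the environment before Alembic runs. For reference, here is the same effect in Python as a minimal sketch (real projects often use `python-dotenv`; `load_env` is a hypothetical helper, not part of this codebase):

```python
import os

def load_env(path: str = ".env") -> None:
    """Export KEY=VALUE lines from a dotenv-style file into os.environ."""
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            # Skip blank lines, comments, and anything that isn't KEY=VALUE.
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Strip surrounding quotes, mirroring how the shell would expand them.
            os.environ[key.strip()] = value.strip().strip('"').strip("'")
```

After calling `load_env()`, `DATABASE_URL` and the other settings are visible to child processes exactly as if the file had been sourced.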
After modifying models in `core/models.py`:

```bash
cd backend
alembic revision --autogenerate -m "describe your changes"
```

Always review the generated migration file before applying!
```bash
alembic upgrade head       # Apply all pending migrations
alembic downgrade -1       # Roll back the last migration
alembic history --verbose  # View migration history
alembic current            # Show current database version
alembic stamp head         # Mark DB as up-to-date (use cautiously)
```

| Issue | Solution |
|---|---|
| "table already exists" error | Verify `DATABASE_URL` in `.env` is set to PostgreSQL, not SQLite |
| "alembic: command not found" | Activate the virtual environment: `source .venv/bin/activate` |
| Autogenerate misses a table | Verify the model is imported in `core/models.py` |
| Dialect errors | Ensure `.env` is sourced and `DATABASE_URL` uses PostgreSQL |
| Schema drift | Run `alembic upgrade head`, then regenerate the migration |
| Connection refused | Check PostgreSQL is running: `podman ps` |
Backend tests (pytest):
```bash
source .venv/bin/activate
cd backend
pytest                                    # Run all tests
pytest tests/test_specific.py             # Run a specific file
pytest tests/test_file.py::test_function  # Run a specific test
pytest -v                                 # Verbose output
pytest -k "auth"                          # Run tests matching a pattern
pytest --cov                              # With coverage report (pytest-cov plugin)
```

Frontend tests (Jest):
```bash
cd frontend
npm test                # Interactive mode
npm test -- --coverage  # With coverage report
```

Frontend:
```bash
cd frontend
npm run build
```

Podman image:

```bash
podman build -t vista .
podman run -p 8000:8000 vista
```

Access at http://localhost:8000
This feature allows external ML pipelines to submit analysis results for visualization in the UI. Users cannot trigger analyses directly; they are initiated by external systems.
- Navigate to an image in the web interface
- If ML analyses exist, view them in the "ML Analyses" sidebar panel
- Toggle bounding box and heatmap overlays
- Adjust visualization opacity or use side-by-side comparison
- Export results as JSON or CSV
```bash
ML_ANALYSIS_ENABLED=true
ML_CALLBACK_HMAC_SECRET=your-secure-secret
ML_ALLOWED_MODELS=yolo_v8,resnet50_classifier,custom_model
```

- Create analysis - `POST /api/images/{image_id}/analyses`
- Update to processing - `PATCH /api/analyses/{analysis_id}/status`
- Request presigned URL - `POST /api/analyses/{analysis_id}/artifacts/presign`
- Upload artifacts to S3 - `PUT <presigned_url>`
- Submit annotations - `POST /api/analyses/{analysis_id}/annotations:bulk`
- Finalize - `POST /api/analyses/{analysis_id}/finalize`
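The artifact-upload step goes straight to object storage with a plain HTTP `PUT`; the presigned URL itself carries the authorization, so no API headers are needed. A standard-library sketch (hypothetical helpers, not code from this repo):

```python
import urllib.request

def build_put(presigned_url: str, data: bytes,
              content_type: str = "application/octet-stream") -> urllib.request.Request:
    """Prepare the PUT request for a presigned S3/MinIO URL."""
    return urllib.request.Request(
        presigned_url,
        data=data,
        method="PUT",
        headers={"Content-Type": content_type},
    )

def upload_artifact(presigned_url: str, data: bytes) -> int:
    """Send the PUT and return the HTTP status code."""
    with urllib.request.urlopen(build_put(presigned_url, data)) as resp:
        return resp.status
```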
Security: Pipeline endpoints require HMAC authentication:
- Header: `X-ML-Signature` (HMAC-SHA256 of request body)
- Header: `X-ML-Timestamp` (Unix timestamp for replay protection)
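Both headers can be produced with Python's standard library alone. A minimal client-side sketch (how the backend canonicalizes the body before verifying is an assumption; confirm against the API docs):

```python
import hashlib
import hmac
import json
import time

def sign_request(body: bytes, secret: str) -> dict:
    """Build X-ML-Signature / X-ML-Timestamp headers for a pipeline callback."""
    signature = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return {
        "X-ML-Signature": signature,
        "X-ML-Timestamp": str(int(time.time())),
    }

# The secret comes from ML_CALLBACK_HMAC_SECRET; the payload here is illustrative.
body = json.dumps({"status": "processing"}).encode()
headers = sign_request(body, "your_secret_here")
```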
```bash
export ML_CALLBACK_HMAC_SECRET='your_secret_here'
python scripts/test_ml_pipeline.py --image-id <uuid>
```

For a detailed integration guide, see the API documentation at http://localhost:8000/docs
Production deployments require a reverse proxy for authentication. See comprehensive documentation:
- Setup Guide: `docs/production/proxy-setup.md`
- Nginx Example: `docs/production/nginx-example.conf`
The application uses header-based authentication where the reverse proxy authenticates users and forwards their identity to the backend via HTTP headers.
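In outline, the backend-side check looks something like the sketch below, assuming the default header names from the configuration section (`X-User-Email`, `X-Proxy-Secret`); the real logic lives in `backend/middleware/` and may differ:

```python
import hmac

def authenticate(headers: dict, shared_secret: str):
    """Return the forwarded user identity if the proxy secret matches, else None."""
    supplied = headers.get("X-Proxy-Secret", "")
    # Constant-time comparison avoids leaking the secret through timing.
    if not hmac.compare_digest(supplied, shared_secret):
        return None
    return headers.get("X-User-Email")
```

Any request that arrives without the correct `PROXY_SHARED_SECRET` value is rejected before the user header is trusted.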
```bash
podman build -t vista .
podman run -p 8000:8000 vista
```

Access at http://localhost:8000
Test deployment using minikube:
```bash
# Start minikube
minikube start

# Build and load image
podman build -t vista:latest .
minikube image load vista:latest

# Deploy
kubectl apply -f deployment-test/

# Access application
minikube service vista --url
```

See the `deployment-test/` directory for Kubernetes manifests.
Note: The dev container includes minikube, kubectl, and helm pre-installed.
- Backend: FastAPI (Python 3.11+) with async SQLAlchemy
- Frontend: React 18 with React Router
- Database: PostgreSQL 15
- Storage: MinIO/S3 compatible object storage
- Migrations: Alembic
- Caching: aiocache + diskcache
```
backend/
├── main.py          # Application entry point
├── core/            # Core components (models, schemas, config)
├── routers/         # API endpoint definitions
├── middleware/      # Authentication, CORS, security headers
├── utils/           # Shared utilities (CRUD, caching, S3)
├── alembic/         # Database migrations
└── tests/           # Backend tests
```

```
frontend/
├── src/
│   ├── App.js       # Main application component
│   ├── Project.js   # Project view
│   ├── ImageView.js # Image detail view
│   └── components/  # Reusable React components
```
- Development: Mock user from environment variables
- Production: Header-based authentication via reverse proxy
- Group-based access control: Projects belong to groups
- API keys: Programmatic access via API key authentication
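For programmatic access with an API key, a client request might be assembled as below. Both the `X-API-Key` header name and the `/api/projects` path are assumptions for illustration; check the Swagger UI at `/docs` for the actual scheme:

```python
import urllib.request

def build_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Prepare an authenticated GET against the REST API."""
    return urllib.request.Request(
        f"{base_url}/api/projects",
        headers={"X-API-Key": api_key},  # header name is an assumption
    )

req = build_request("http://localhost:8000", "my-api-key")
# urllib.request.urlopen(req) would send it; omitted so the sketch runs offline.
```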
Comprehensive guides for different user roles:
- User Guide - Complete guide for end users
- Getting started
- Managing projects and images
- Classification and collaboration
- ML analysis visualization
- Administrator Guide - Deployment and maintenance
- Installation & Deployment - Podman, Kubernetes, manual setup
- Configuration - Environment variables and settings
- Authentication & Security - Reverse proxy setup and security
- Database Management - PostgreSQL, migrations, backups
- Storage Configuration - S3/MinIO setup and management
- Monitoring & Maintenance - Logging, metrics, updates
- Troubleshooting - Common issues and solutions
- Developer Guide - Development and contribution guide
- Development environment setup
- Architecture overview
- Backend and frontend development
- API development
- Testing and code style
- ML Analysis API Guide - Machine learning integration
- Authentication and workflow
- API endpoints and data formats
- Example implementations
- Testing and best practices
- Production Proxy Setup - Detailed reverse proxy configuration
- API Documentation: http://localhost:8000/docs (Swagger UI)
- Alternative API Docs: http://localhost:8000/redoc (ReDoc)
- pgAdmin (Database UI): http://localhost:8080 (user: admin@admin.com, pass: admin)
- MinIO Console: http://localhost:9001 (user: minioadmin, pass: minioadminpassword)
Copyright 2025 National Technology & Engineering Solutions of Sandia, LLC (NTESS). Under the terms of Contract DE-NA0003525 with NTESS, the U.S. Government retains certain rights in this software.
MIT License

