A full-stack web application for Bitcoin price prediction using machine learning models, featuring real-time data collection, multiple prediction horizons, and interactive charts.
- Frontend: React + TypeScript + Vite - Modern UI with interactive charts
- Backend: FastAPI (Python) - REST API for data access
- Database: PostgreSQL - Stores historical data, features, and predictions
- ML Pipeline: Python scripts - Data collection, feature engineering, model training, and inference
- Containerization: Docker & Docker Compose
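How these services fit together is defined in docker-compose.yml; schematically it looks roughly like the sketch below (illustrative only — the image tag, volumes, and healthchecks in the real file will differ):

```yaml
services:
  postgres:
    image: postgres:16        # image tag is illustrative
    environment:
      POSTGRES_DB: criptify_db
      POSTGRES_USER: criptify_user
      POSTGRES_PASSWORD: criptify_password
    ports:
      - "5432:5432"

  backend:
    build: ./backend
    environment:
      DATABASE_URL: postgresql://criptify_user:criptify_password@postgres:5432/criptify_db
    ports:
      - "8000:8000"
    depends_on:
      - postgres
```

The frontend dev server runs outside Compose (via npm), which is why the quick-start below only brings up PostgreSQL and the backend.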
```
Cryptify/
├── frontend/                  # React frontend application
│   ├── src/                   # Source code
│   │   ├── components/        # React components
│   │   ├── services/          # API service
│   │   └── types/             # TypeScript types
│   ├── package.json           # Node.js dependencies
│   └── vite.config.ts         # Vite configuration
├── backend/                   # FastAPI backend service
│   ├── app/                   # FastAPI application
│   │   └── main.py            # Main API endpoints
│   ├── models/                # Database models (SQLAlchemy)
│   ├── services/              # Business logic services
│   ├── schemas/               # Pydantic schemas
│   ├── requirements.txt       # Python dependencies
│   └── Dockerfile             # Backend container config
├── scripts/                   # ML pipeline scripts
│   ├── data_collector.py      # Data collection from exchanges
│   ├── multi_model_trainer.py # Model training
│   ├── predictor.py           # Inference/prediction
│   └── requirements.txt       # ML dependencies
├── docker/                    # Docker configuration
│   └── init.sql               # Database initialization
└── docker-compose.yml         # Multi-container orchestration
```
- Docker and Docker Compose
- Node.js 18+ (for frontend development)
- Python 3.12+ (for local development, optional)
```bash
# Start PostgreSQL and FastAPI backend
docker-compose up -d

# Check service status
docker-compose ps

# View backend logs
docker-compose logs -f backend
```

The backend will be available at http://localhost:8000.

- API docs: http://localhost:8000/docs
- Health check: http://localhost:8000/health
```bash
cd frontend

# Install dependencies (first time only)
npm install

# Start development server
npm run dev
```

The frontend will be available at http://localhost:5173.
```bash
# Collect historical data (this may take 10-30 minutes)
curl -X POST http://localhost:8000/ml/data-collector/run \
  -H "Content-Type: application/json" \
  -d '{"mode": "batch", "timeout": 3600}'

# Train models (this may take 30-60 minutes)
curl -X POST http://localhost:8000/ml/trainer/run \
  -H "Content-Type: application/json" \
  -d '{"mode": "batch", "timeout": 7200}'

# Generate predictions
curl -X POST http://localhost:8000/ml/predictor/run \
  -H "Content-Type: application/json" \
  -d '{"timeout": 300}'
```

Alternatively, use the interactive API documentation at http://localhost:8000/docs to run these endpoints.
- Ubuntu 20.04+ / Debian 11+ (or other Linux distribution)
- Docker and Docker Compose installed
- Git installed
- Minimum 4GB RAM, 20GB free space
- Open ports: 22 (SSH), 5173 (Frontend), 8000 (Backend API, optional)
```bash
# Update system
sudo apt update && sudo apt upgrade -y

# Install Docker
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh
sudo usermod -aG docker $USER
newgrp docker

# Docker Compose v2 is included with Docker, so no separate install is needed

# Install Node.js 18+
curl -fsSL https://deb.nodesource.com/setup_18.x | sudo -E bash -
sudo apt-get install -y nodejs

# Install Git (if not installed)
sudo apt install git -y
```

```bash
# Clone repository
cd ~
mkdir -p projects
cd projects
git clone https://github.com/FaritSharafutdinov/Cryptify cryptify
cd cryptify
git checkout dev

# Fix permissions for the scripts directory
chmod -R 755 scripts/
```

```bash
# Start PostgreSQL and FastAPI backend
docker-compose up -d

# Check status
docker-compose ps

# Verify backend is running
curl http://localhost:8000/health
```
```bash
# Install frontend dependencies
cd ~/projects/cryptify/frontend
npm install

# Start in the background using screen
sudo apt install screen -y
screen -S frontend
npm run dev -- --host 0.0.0.0
# Press Ctrl+A, then D to detach

# Or use nohup
nohup npm run dev -- --host 0.0.0.0 > frontend.log 2>&1 &
```

```bash
# Allow required ports
sudo ufw allow 22/tcp    # SSH
sudo ufw allow 5173/tcp  # Frontend
sudo ufw allow 8000/tcp  # Backend API (optional)

# Enable firewall
sudo ufw enable
sudo ufw status
```

```bash
# Collect historical data (10-30 minutes)
curl -X POST http://localhost:8000/ml/data-collector/run \
  -H "Content-Type: application/json" \
  -d '{"mode": "batch", "timeout": 3600}'

# Train models (30-60 minutes)
curl -X POST http://localhost:8000/ml/trainer/run \
  -H "Content-Type: application/json" \
  -d '{"mode": "batch", "timeout": 7200}'

# Generate predictions
curl -X POST http://localhost:8000/ml/predictor/run \
  -H "Content-Type: application/json" \
  -d '{"timeout": 300}'
```

```bash
# Set up automated tasks
./scripts/setup_cron.sh

# Verify cron jobs
crontab -l
```

Automation schedule:

- Data Collection: Every hour (at minute 0)
- Model Retraining: Every Sunday at 2:00 AM
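setup_cron.sh installs entries along these lines (illustrative — the exact commands, request modes, and log paths are defined by the script itself):

```
# m  h  dom mon dow  command
0  *  *   *   *   curl -s -X POST http://localhost:8000/ml/data-collector/run -H "Content-Type: application/json" -d '{"mode": "incremental", "timeout": 3600}' >> logs/cron_data_collector.log 2>&1
0  2  *   *   0   curl -s -X POST http://localhost:8000/ml/trainer/run -H "Content-Type: application/json" -d '{"mode": "retrain", "timeout": 7200}' >> logs/cron_model_trainer.log 2>&1
```

`0 * * * *` fires at minute 0 of every hour, and `0 2 * * 0` fires Sundays at 2:00 AM, matching the schedule above.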
After setup, the application will be available at:

- Frontend: http://YOUR_SERVER_IP:5173
- Backend API: http://YOUR_SERVER_IP:8000
- API Documentation: http://YOUR_SERVER_IP:8000/docs
- Real-time Data Collection: Automated collection of BTC/USDT OHLCV data from Binance
- Feature Engineering: Technical indicators (RSI, MACD, ATR, etc.) and temporal features
- Multiple ML Models:
- Linear Regression
- XGBoost
- LSTM (Neural Network)
- Multiple Prediction Horizons: 6h, 12h, 24h ahead
- Interactive Charts: Real-time price charts with prediction overlays
- RESTful API: Comprehensive API for data access and ML operations
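The real indicator implementations live in the ML pipeline scripts; purely as an illustration of one of the features listed above, here is a minimal, dependency-free sketch of a simple-average RSI (production code typically uses Wilder's exponential smoothing, e.g. via pandas or TA-Lib):

```python
def rsi(closes: list[float], period: int = 14) -> float:
    """Simple-average RSI over the last `period` price changes.

    Illustrative sketch only; the project's actual feature code may
    use a different smoothing scheme or a library implementation.
    """
    if len(closes) < period + 1:
        raise ValueError("need at least period + 1 closing prices")
    # Price changes over the most recent `period` intervals
    deltas = [b - a for a, b in zip(closes[-period - 1:], closes[-period:])]
    gains = sum(d for d in deltas if d > 0) / period
    losses = sum(-d for d in deltas if d < 0) / period
    if losses == 0:
        return 100.0  # all moves were up (or flat)
    rs = gains / losses
    return 100.0 - 100.0 / (1.0 + rs)
```

A monotonically rising series yields RSI 100, and a series with equal up and down moves yields RSI 50, which is a quick sanity check for any implementation.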
Backend environment variables (see backend/env.example):

- DATABASE_URL: PostgreSQL connection string
- API_HOST: API host (default: 0.0.0.0)
- API_PORT: API port (default: 8000)

Frontend environment variables (optional):

- VITE_API_URL: Backend API URL (default: /api, which uses the dev-server proxy)
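As a sketch of how the backend variables above might be consumed (the actual settings code may differ — e.g. it could use pydantic's BaseSettings; the DATABASE_URL default here is illustrative):

```python
import os
from typing import Mapping


def load_config(env: Mapping[str, str] = os.environ) -> dict:
    """Resolve backend settings from the environment, with documented defaults.

    The DATABASE_URL fallback is illustrative only; the real value is
    supplied via docker-compose.yml / backend/env.example.
    """
    return {
        "database_url": env.get(
            "DATABASE_URL",
            "postgresql://criptify_user:criptify_password@postgres:5432/criptify_db",
        ),
        "api_host": env.get("API_HOST", "0.0.0.0"),
        "api_port": int(env.get("API_PORT", "8000")),
    }
```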
Default PostgreSQL credentials (configured in docker-compose.yml):

- Database: criptify_db
- User: criptify_user
- Password: criptify_password
- Port: 5432
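With the defaults above, DATABASE_URL takes the usual PostgreSQL URL form. A small helper for assembling (and sanity-checking) it — the host is `localhost` when connecting from the host machine, or the Compose service name when connecting from another container:

```python
from urllib.parse import quote_plus


def pg_url(user: str, password: str, host: str, port: int, db: str) -> str:
    """Build a PostgreSQL connection URL, escaping special characters."""
    return f"postgresql://{quote_plus(user)}:{quote_plus(password)}@{host}:{port}/{db}"


# Defaults from docker-compose.yml, connecting from the host machine
url = pg_url("criptify_user", "criptify_password", "localhost", 5432, "criptify_db")
# → postgresql://criptify_user:criptify_password@localhost:5432/criptify_db
```

Escaping matters if you ever change the password to something containing `@`, `:`, or `/`.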
- GET /health - Health check
- GET /ml/scripts/status/{script_name} - Check ML script status
- GET /history - Get historical data and predictions
- GET /features/latest - Get latest features
- GET /predictions/latest - Get latest predictions
- POST /ml/data-collector/run - Run data collection
  - Body: {"mode": "batch" | "incremental", "timeout": 3600}
- POST /ml/trainer/run - Train models
  - Body: {"mode": "batch" | "retrain", "timeout": 7200}
- POST /ml/predictor/run - Generate predictions
  - Body: {"timeout": 300}

See the full API documentation at http://localhost:8000/docs when the backend is running.
```bash
# Run backend locally (without Docker)
cd backend
pip install -r requirements.txt
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
```

```bash
# Run frontend locally
cd frontend
npm install
npm run dev
```

```bash
# Run ML scripts locally
cd scripts
pip install -r requirements.txt

# Run data collection
python data_collector.py

# Train models
python multi_model_trainer.py

# Generate predictions
python predictor.py
```
```bash
# Start all services
docker-compose up -d

# Stop all services
docker-compose down

# View logs
docker-compose logs -f [service_name]

# Rebuild containers
docker-compose build --no-cache

# Access PostgreSQL
docker-compose exec postgres psql -U criptify_user -d criptify_db

# Access the backend container
docker-compose exec backend bash
```
```bash
# Pull latest changes
git pull origin dev

# Restart services
docker-compose down
docker-compose build --no-cache
docker-compose up -d

# Restart the frontend (if needed)
cd frontend
npm install  # if package.json changed
# Then restart using screen or nohup
```
```bash
# Check container status
docker-compose ps

# View backend logs
docker-compose logs -f backend

# Check cron logs
tail -f logs/cron_data_collector.log
tail -f logs/cron_model_trainer.log

# Check resource usage
docker stats
```
```bash
# Check backend health
curl http://localhost:8000/health

# Check script status
curl http://localhost:8000/ml/scripts/status/data_collector.py

# View recent logs
docker-compose logs --tail=50 backend

# Restart the backend
docker-compose restart backend
```

- Backend Guide: see backend/BACKEND_GUIDE.md
- ML Scripts API: see backend/ML_SCRIPTS_API.md
- Automation: see scripts/README_AUTOMATION.md
See LICENSE file for details.
- Fork the repository
- Create a feature branch
- Make your changes
- Submit a pull request
For issues and questions, please open an issue on GitHub.