You're a trader looking for arbitrage opportunities across DEX pools. The data is valuable, but you don't want bots scraping it for free.
This platform:
- Monitors DEX swaps on Ethereum mainnet (Uniswap V3 USDC/WETH pool)
- Detects price deltas that signal arbitrage opportunities
- Encrypts everything using Lighthouse's native encryption
- Gates access with ERC20 tokens - 1 DADC = 1 decrypt
- Burns tokens when you unlock data, creating real scarcity
Think of it as Bloomberg Terminal data, but encrypted on IPFS and unlocked with crypto tokens. Each decrypt costs 1 DADC token (destroyed forever), so 100 tokens = exactly 100 uses.
Get access in 30 seconds:
- Claim tokens → Open Faucet
- Connect wallet → Click "Connect to Web3"
- Call `claimTokens()` → Get 100 DADC instantly
- Visit app → dexarb-data-unlock.vercel.app
- Unlock data → Burns 1 DADC, shows arbitrage opportunities
Network: Sepolia Testnet
Cost: 1 DADC per decrypt (burned to 0xdead)
Claim limit: Once per wallet
Python Worker
├── Polls Uniswap V3 swaps every 15 seconds (Blockscout API)
├── Calculates arbitrage spreads (token price deltas)
├── Packages as JSONL with timestamps
├── Encrypts with Lighthouse SDK
├── Sets access control: balanceOf(DADC) >= 1 token
└── Auto-cleans old files (keeps only latest)
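The spread calculation above can be sketched in a few lines. This is an illustrative helper, not the exact code in `apps/worker/transform.py`: given recent trade prices for a token, the arbitrage signal is the gap between the cheapest and most expensive observation, expressed in basis points.

```python
# Hypothetical sketch of the worker's spread calculation: the arbitrage
# signal is the max/min gap across recent swap prices, in basis points.

def spread_bps(prices: list[float]) -> float:
    """Return the max/min price gap in basis points (1 bp = 0.01%)."""
    lo, hi = min(prices), max(prices)
    return (hi - lo) / lo * 10_000

# e.g. WETH quoted between 2500.0 and 2512.5 across swaps -> 50 bps spread
print(round(spread_bps([2500.0, 2507.1, 2512.5]), 1))  # 50.0
```

A delta of a few dozen basis points is what the worker flags as a potential opportunity; the real implementation also accounts for timestamps and pool identity.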
Key File: apps/worker/lighthouse_native_encryption.py
- Uses Lighthouse's Kavach encryption (not custom AES)
- Signs auth messages with eth-account
- Applies ERC20 access control via SDK
- Uploads to IPFS with distributed key shards
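The ERC20 gate attached at upload time looks roughly like the condition object below. The field values are this project's (Sepolia, DADC, balance >= 1 token with 18 decimals); the exact schema is an assumption based on Lighthouse's documented access-condition format, so verify it against the SDK docs before relying on it.

```python
# Sketch of the access-control condition the worker applies (assumed
# Lighthouse condition schema; values are this project's DADC setup).

DADC = "0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"

conditions = [
    {
        "id": 1,
        "chain": "Sepolia",
        "method": "balanceOf",
        "standardContractType": "ERC20",
        "contractAddress": DADC,
        "parameters": [":userAddress"],
        "returnValueTest": {
            # require balance >= 1 DADC (token has 18 decimals)
            "comparator": ">=",
            "value": str(10**18),
        },
    }
]
aggregator = "([1])"  # single condition, no boolean combination
```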
Next.js App
├── Connects MetaMask wallet
├── Checks DADC balance → Shows "X decrypts available"
├── User clicks unlock → Burns 1 DADC to 0xdead
├── Signs decryption request → Lighthouse checks new balance
├── Lighthouse grants access → Downloads encrypted file
└── Decrypts locally → Shows arbitrage data
Key File: frontend/app/page.tsx
- Implements `burnTokenForAccess()`: transfers 1 DADC to the burn address before each decrypt
- Uses Lighthouse SDK for decryption
- Updates UI to show remaining decrypts
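The "X decrypts available" figure is just the raw ERC20 balance scaled down by the token's 18 decimals. The real code is TypeScript in `frontend/app/page.tsx`; this is the same arithmetic sketched in Python:

```python
# One whole DADC = one decrypt, so the UI counter is the raw balance
# divided by 10^18 (integer division: dust below 1 token doesn't count).

DECIMALS = 18

def decrypts_available(raw_balance: int) -> int:
    """Whole DADC tokens held = number of 1-token burns still possible."""
    return raw_balance // 10**DECIMALS

print(decrypts_available(100 * 10**18))     # 100
print(decrypts_available(99 * 10**18 + 5))  # 99
```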
DataCoin (ERC20)
├── Symbol: DADC
├── Supply: 100 billion
├── Created via 1MB.io factory
└── Used for access control
DataCoinFaucet
├── Gives 100 DADC per wallet
├── One-time claim (anti-spam)
└── Instant minting
Deployment: Created through official 1MB.io factory with 10,000 LSDC lock
```
┌───────────────────────────────────────────────────────────────┐
│                         User Browser                          │
│  ┌─────────────────────────────────────────────────────────┐  │
│  │  Next.js Frontend (Vercel)                              │  │
│  │   • Connect wallet                                      │  │
│  │   • Burn 1 DADC → 0xdead                                │  │
│  │   • Sign decrypt request                                │  │
│  │   • Download + decrypt locally                          │  │
│  └───────────┬─────────────────────────────────────────────┘  │
└──────────────┼────────────────────────────────────────────────┘
               │
               ├──────────► Sepolia Testnet
               │            └── DataCoin.balanceOf() check
               │
               ├──────────► Lighthouse Storage
               │            ├── Check access control
               │            ├── Distribute key shards
               │            └── Serve encrypted CID
               │
┌──────────────┼────────────────────────────────────────────────┐
│              ▼                                                │
│         Python Worker (Railway)                               │
│  ┌──────────────────────────────────────────────────────┐     │
│  │  1. Poll Blockscout API (Uniswap V3 swaps)           │     │
│  │  2. Transform to JSONL (arbitrage detection)         │     │
│  │  3. Encrypt with Lighthouse native SDK               │     │
│  │  4. Apply ERC20 gating (DADC balance >= 1)           │     │
│  │  5. Upload to IPFS (distributed key shards)          │     │
│  │  6. Cleanup old files (keep only latest)             │     │
│  └──────────────────────────────────────────────────────┘     │
└───────────────────────────────────────────────────────────────┘
               │
               ├──────────► Ethereum Mainnet RPC
               │            └── Uniswap V3 swap events
               │
               └──────────► Lighthouse API
                            └── Upload encrypted files
```
Problem: How do you monetize data without letting bots scrape it once and share forever?
Solution: Burn tokens on every access.
User Journey:
1. Claim 100 DADC from faucet → Balance: 100 DADC
2. Unlock data (burn 1 DADC) → Balance: 99 DADC
3. Unlock again (burn 1 DADC) → Balance: 98 DADC
...
100. Last unlock → Balance: 0 DADC (no more access)
Why it works:
- Lighthouse checks `balanceOf(userAddress) >= 1 DADC` before each decrypt
- Frontend voluntarily burns 1 DADC before requesting access
- User's balance decreases → Lighthouse naturally denies access when balance hits 0
- No backend burn logic needed (Lighthouse is read-only)
- Creates real scarcity (deflationary model)
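The mechanics above can be simulated in a few lines of plain Python (account names and the 150-attempt loop are illustrative): access is granted while the balance holds at least one whole token, each unlock moves one token to the burn address, so 100 claimed tokens buy exactly 100 unlocks.

```python
# Minimal simulation of burn-per-access: a balance gate (Lighthouse's
# role) plus a voluntary 1-token burn before each decrypt (frontend's role).

ONE_DADC = 10**18
balances = {"user": 100 * ONE_DADC, "0xdead": 0}

def unlock(who: str) -> bool:
    if balances[who] < ONE_DADC:      # balance gate: need >= 1 DADC
        return False                  # access denied
    balances[who] -= ONE_DADC         # burn 1 DADC before decrypting
    balances["0xdead"] += ONE_DADC
    return True

# 150 attempts: only the first 100 succeed, then the balance is exhausted.
uses = sum(1 for _ in range(150) if unlock("user"))
print(uses)              # 100
print(balances["user"])  # 0 -> all further access denied
```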
Live burn transactions: View on Etherscan
| Layer | Technology | Why |
|---|---|---|
| Frontend | Next.js 14 + ethers.js v6 | React for UI, ethers for wallet interactions |
| Encryption | Lighthouse SDK v0.3.3 | Native Kavach encryption with BLS threshold crypto |
| Backend | Python 3.12 + aiohttp | Async worker for high-throughput swap ingestion |
| Blockchain Data | Blockscout MCP Server | Structured API for Uniswap V3 swap events |
| Smart Contracts | Solidity 0.8.20 | ERC20 (DataCoin) + Faucet on Sepolia |
| Deployment | Railway + Vercel | Backend on Railway, frontend on Vercel |
| State | JSON files | Checkpoints, deduplication, price buffers |
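The JSON state layer can be as simple as the sketch below: each concern (checkpoint, dedupe, price buffer) is one small file that is re-read on startup, which is how the worker survives restarts. File names match the `state/` directory in this repo; the helper functions themselves are illustrative, not the worker's actual API.

```python
# Illustrative checkpoint persistence: read state if present, else use a
# default; write it back after each batch so progress survives restarts.

import json
from pathlib import Path

def load_state(path: Path, default):
    return json.loads(path.read_text()) if path.exists() else default

def save_state(path: Path, value) -> None:
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(value))

ckpt = Path("state/last_block.json")
last_block = load_state(ckpt, {"block": 0})
last_block["block"] = 19_000_123           # processed up to this block
save_state(ckpt, last_block)
print(load_state(ckpt, None))              # {'block': 19000123}
```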
```
af_hosted/
├── apps/
│   └── worker/                              # Backend (Railway)
│       ├── run.py                           # Main loop: fetch → transform → encrypt → upload
│       ├── lighthouse_native_encryption.py  # Lighthouse SDK wrapper
│       ├── blockscout_client.py             # Blockscout MCP integration
│       ├── transform.py                     # DEX event → arbitrage JSONL
│       ├── http_server.py                   # Health endpoint
│       └── settings.py                      # Config from env vars
│
├── frontend/                                # Frontend (Vercel)
│   ├── app/
│   │   ├── page.tsx                         # Main unlock page (burn + decrypt)
│   │   ├── layout.tsx                       # Next.js layout
│   │   └── globals.css                      # Tailwind styles
│   ├── next.config.js                       # Next.js config
│   └── package.json                         # npm dependencies
│
├── contracts/
│   └── DataCoinFaucet.sol                   # One-time claim faucet
│
├── scripts/
│   ├── createDEXArbDataCoin.js              # 1MB.io factory deployment
│   ├── deployFaucet.js                      # Faucet deployment
│   └── verify_lighthouse_protection.py      # Test access control
│
├── state/                                   # Worker state (gitignored)
│   ├── last_block.json                      # Last processed block
│   ├── dedupe.json                          # Prevent duplicate swaps
│   └── price_buffer.json                    # 24h price rolling window
│
├── requirements.txt                         # Python deps
├── package.json                             # Root npm deps (DataCoin creation)
├── nixpacks.toml                            # Railway build config
└── Procfile                                 # Railway start command
```
Address: 0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
Network: Sepolia Testnet
Standard: ERC20 (1MB.io DataCoin)
Supply: 100,000,000,000 DADC (100 billion)
Decimals: 18

Deployment:
- Created via official 1MB.io factory
- Locked 10,000 LSDC (Lighthouse Sepolia Data Coin)
- 99% supply available for minting
- 1% in Uniswap liquidity pool
Address: 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB
Function: claimTokens() β Mints 100 DADC
Limit: One claim per address

Prerequisites:
- Node.js 18+ (for Lighthouse SDK)
- Python 3.12+
- MetaMask wallet on Sepolia
```bash
cd apps/worker

# Install dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# Install Lighthouse CLI (required for cleanup)
npm install -g @lighthouse-web3/sdk

# Configure environment
export BLOCKSCOUT_MCP_BASE="https://eth-sepolia.blockscout.com/api/v2"
export CHAIN_ID=11155111
export LIGHTHOUSE_API_KEY="your_key"
export LIGHTHOUSE_WALLET_PRIVATE_KEY="0x..."
export DATACOIN_CONTRACT_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"

# Run worker
python run.py
```

Expected output:

```
✅ Blockscout MCP client initialized
✅ Lighthouse encryption configured
✅ HTTP server started on http://0.0.0.0:8787
✅ Fetched 47 swaps from Uniswap V3 USDC/WETH
✅ Lighthouse upload successful: QmXXX...
✅ Auto-cleanup: deleted 3 old files
```
```bash
cd frontend

# Install dependencies
npm install

# Configure environment
export NEXT_PUBLIC_DATACOIN_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"
export NEXT_PUBLIC_FAUCET_ADDRESS="0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB"
export NEXT_PUBLIC_CHAIN_ID=11155111
export NEXT_PUBLIC_METADATA_API="http://localhost:8787"

# Start dev server
npm run dev
```

Visit: http://localhost:3000
- Connect repo to Railway
- Set env vars:

  ```
  LIGHTHOUSE_API_KEY=...
  LIGHTHOUSE_WALLET_PRIVATE_KEY=0x...
  BLOCKSCOUT_MCP_BASE=https://eth-sepolia.blockscout.com/api/v2
  CHAIN_ID=11155111
  DATACOIN_CONTRACT_ADDRESS=0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
  ```

- Deploy → Railway auto-detects nixpacks.toml
- Verify → Check logs for "✅ Lighthouse upload successful"
- Connect repo to Vercel
- Set env vars:

  ```
  NEXT_PUBLIC_DATACOIN_ADDRESS=0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
  NEXT_PUBLIC_FAUCET_ADDRESS=0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB
  NEXT_PUBLIC_CHAIN_ID=11155111
  ```

- Deploy → Auto-deploys from main branch
- Visit → https://dexarb-data-unlock.vercel.app
- Uses Lighthouse's Kavach encryption (threshold BLS cryptography)
- Distributes key shards across 5 nodes
- Checks ERC20 balance on every decrypt attempt
- No backend decryption logic needed
Code: apps/worker/lighthouse_native_encryption.py
- Frontend burns 1 DADC to `0xdead` before each decrypt
- Lighthouse checks the updated balance
- When balance = 0, access denied
- Deflationary (tokens destroyed forever)
Code: frontend/app/page.tsx → `burnTokenForAccess()`
- Deletes old encrypted files after each upload
- Keeps only latest file (saves storage costs)
- Uses Lighthouse CLI for reliable deletion
Code: apps/worker/lighthouse_cleanup.py
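The core selection logic of the cleanup pass is easy to show in isolation (the function name and CIDs below are illustrative): of all the CIDs Lighthouse reports for this API key, everything except the just-uploaded file is marked for deletion. The real worker does the listing and deleting through the Lighthouse CLI.

```python
# Illustrative core of the auto-cleanup: keep only the protected (latest)
# CID, mark every other upload for deletion.

def files_to_delete(all_cids: list[str], protected_cid: str) -> list[str]:
    return [cid for cid in all_cids if cid != protected_cid]

uploads = ["QmOld1", "QmOld2", "QmLatest"]
print(files_to_delete(uploads, protected_cid="QmLatest"))  # ['QmOld1', 'QmOld2']
```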
- Tracks 24h rolling window of token prices
- Calculates min/max/mean for each token
- Highlights profitable spreads
- Persists state across restarts
Code: apps/worker/transform.py
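The rolling window can be sketched with a `deque`: timestamped prices go in, entries older than 24 hours fall out, and min/max/mean come from what remains. Function and variable names are illustrative, not the exact `transform.py` API.

```python
# Sketch of a 24h rolling price window: append the new point, expire old
# points, and compute stats over the survivors.

from collections import deque

WINDOW = 24 * 3600  # seconds

def window_stats(buffer: deque, now: float, price: float):
    buffer.append((now, price))
    while buffer and buffer[0][0] < now - WINDOW:   # drop expired points
        buffer.popleft()
    prices = [p for _, p in buffer]
    return min(prices), max(prices), sum(prices) / len(prices)

buf = deque()
window_stats(buf, now=0, price=2500.0)
window_stats(buf, now=3600, price=2510.0)
lo, hi, mean = window_stats(buf, now=90_000, price=2505.0)  # first point expired
print(lo, hi, mean)  # 2505.0 2510.0 2507.5
```

Persisting `buf` to `state/price_buffer.json` between runs is what makes the stats survive restarts.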
```bash
# Test without tokens (should fail)
python scripts/verify_lighthouse_protection.py \
  --cid QmXXX... \
  --wallet 0xWithoutTokens
# Output: ❌ Access denied (balance = 0)

# Claim tokens from faucet
cast send 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB \
  "claimTokens()" \
  --rpc-url https://ethereum-sepolia-rpc.publicnode.com \
  --private-key 0x...

# Test with tokens (should succeed)
python scripts/verify_lighthouse_protection.py \
  --cid QmXXX... \
  --wallet 0xWithTokens
# Output: ✅ Access granted (balance = 100 DADC)
```

- BURN_IMPLEMENTATION_COMPLETE.md - Token burning implementation details
- LIGHTHOUSE_NATIVE_ENCRYPTION_REFACTOR_PLAN.md - Encryption architecture
- DATACOIN_DEPLOYMENT.md - Smart contract deployment guide
- RAILWAY_ENV_VARS_REQUIRED.md - Environment variable reference
The Problem: On-chain data is either:
- Free → Anyone can scrape it (no monetization)
- Centralized → Behind API keys/paywalls (censorship risk)
This Solution:
- Data lives on IPFS (decentralized, permanent)
- Encrypted with threshold cryptography (no single point of failure)
- Access gated by ERC20 tokens (programmable economics)
- Tokens burned on use (creates scarcity)
Result: True data markets on-chain. No middleman, no censorship, pay-per-use.
- Lighthouse - Decentralized encrypted storage (ETHOnline Sponsor)
- 1MB.io - DataCoin creation platform
- Blockscout - Blockchain data API
- Railway - Backend deployment
- Vercel - Frontend deployment
- ethers.js - Ethereum interactions
MIT - See LICENSE for details
Made for ETHOnline 2025
- Real-Time Monitoring: Ingests swap events from Base, Ethereum, Polygon DEX pools every 5 minutes
- Arbitrage Detection: Identifies profitable price deltas across DEX pairs using rolling 24h windows
- Encrypted Storage: Publishes datasets to Lighthouse with automatic old-file cleanup (maintains a single latest file)
- Block Explorer Integration: Every transaction links to Autoscout for deep inspection
- AI Agent Interface: ASI:One chat protocol for conversational data access (future)
- Data Monetization: Package and sell curated datasets via Lighthouse DataCoin (future)
```
┌────────────────────────────────────────────────────────────────┐
│                       Railway Platform                         │
│  ┌──────────────────────────────────────────────────────────┐  │
│  │  Python Worker (apps/worker)                             │  │
│  │   • Polls DEX events every 5 min via Blockscout MCP      │  │
│  │   • Transforms to JSONL with arbitrage detection         │  │
│  │   • Encrypts & uploads to Lighthouse                     │  │
│  │   • Auto-deletes old files (keeps only latest)           │  │
│  │   • Maintains state: checkpoints, deduplication          │  │
│  └─────────────┬────────────────────────────────────────────┘  │
└────────────────┼───────────────────────────────────────────────┘
                 │
        ┌────────┼─────────────┐
        ▼        ▼             ▼
  ┌──────────┐ ┌──────────┐ ┌──────────────┐
  │Blockscout│ │Chainlink │ │  Lighthouse  │
  │   MCP    │ │  Price   │ │   Storage    │
  │  Server  │ │  Feeds   │ │ (Encrypted)  │
  └──────────┘ └──────────┘ └──────────────┘
        │        │             │
        ▼        ▼             ▼
  ┌───────────────────────────────────────┐
  │    Base/ETH/Polygon Mainnet (RPC)     │
  └───────────────────────────────────────┘
```
Production Features:
- ✅ Auto-Cleanup: Maintains only 1 file on Lighthouse (deletes old uploads automatically)
- ✅ Rolling Window: 24-hour price tracking for accurate arbitrage detection
- ✅ State Persistence: Checkpoints, deduplication, price buffers survive restarts
- ✅ Dual API Failover: Primary/fallback endpoint switching on errors
- ✅ HTTP Health Endpoint: `/health` for Railway monitoring
- ✅ Multi-Chain: Base, Ethereum, Polygon support
- Railway account (railway.app)
- Lighthouse API key (files.lighthouse.storage)
- Blockscout API keys for Base, Ethereum, Polygon
- Chainlink price feed contracts (optional, uses fallback if not set)
1. Connect Repository

   ```bash
   railway link
   ```

2. Set Environment Variables (in the Railway dashboard or CLI)

   ```
   # Core Configuration
   BLOCKSCOUT_BASE_URL=https://base.blockscout.com
   BLOCKSCOUT_BASE_API_KEY=your_base_key
   BLOCKSCOUT_ETH_URL=https://eth.blockscout.com
   BLOCKSCOUT_ETH_API_KEY=your_eth_key
   BLOCKSCOUT_POLYGON_URL=https://polygon.blockscout.com
   BLOCKSCOUT_POLYGON_API_KEY=your_polygon_key

   # Lighthouse Storage (auto-cleanup enabled)
   LIGHTHOUSE_API_KEY=your_lighthouse_key

   # Optional: Chainlink Price Feeds (uses fallback if not set)
   CHAINLINK_ETH_USD=0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419
   CHAINLINK_USDC_USD=0x8fFfFfd4AfB6115b954Bd326cbe7B4BA576818f6
   ```

3. Deploy

   ```bash
   railway up
   ```

4. Verify

   - Check Railway logs for "✅ Lighthouse upload successful"
   - Visit the `/health` endpoint to confirm the worker is running
   - Check the Lighthouse dashboard: you should see only 1 encrypted file (auto-cleanup working)
```bash
# Setup
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows
pip install -r requirements.txt

# Configure
cp .env.example .env
# Edit .env with your API keys

# Run worker
python apps/worker/run.py

# Test HTTP endpoint
curl http://localhost:8787/health
```

```
af_hosted/
├── apps/
│   ├── worker/                   # Core ingestion worker
│   │   ├── run.py                # Main entry point
│   │   ├── blockscout_client.py  # MCP server integration
│   │   ├── chainlink_price.py    # Price feed oracle
│   │   ├── transform.py          # JSONL transformation
│   │   ├── http_server.py        # Health endpoint
│   │   ├── lighthouse_cleanup.py # Auto file cleanup (NEW)
│   │   └── state/                # Checkpoints & deduplication
│   └── hosted-agent/             # Future: ASI:One chat interface
├── infra/
│   └── autoscout/                # Block explorer config
│       └── instance.json         # Explorer URLs
├── scripts/
│   ├── run_worker.sh             # Local development runner
│   └── verify_demo.py            # Demo data generator
├── state/                        # Persistent state (gitignored)
│   ├── last_block.json
│   ├── dedupe.json
│   ├── price_buffer.json
│   └── block_ts.json
├── nixpacks.toml                 # Railway build configuration
├── Procfile                      # Railway startup command
└── requirements.txt              # Python dependencies
```
- Automatic: Runs after every successful upload
- Efficient: Deletes all old files, keeps only latest
- Smart: Uses Lighthouse CLI for reliable deletion
- Fast: Parallel deletion with progress tracking
- Production-Ready: Full error handling & logging
```python
# Happens automatically after each upload
cleanup_lighthouse_storage(
    api_key=LIGHTHOUSE_API_KEY,
    protected_cid="<latest_file_cid>",  # Don't delete this one
    dry_run=False,                      # Delete for real
)
# Result: Only 1 file on Lighthouse ✅
```
- 24-hour price history per token
- Accurate min/max/mean calculations
- Automatic expiry of old data
- Persistent across restarts
- Primary endpoint: `api.lighthouse.storage`
- Fallback endpoint: `upload.lighthouse.storage`
- Automatic failover on errors
- SDK upload with REST fallback
- Real-time price delta calculation
- Multi-DEX comparison (Uniswap, PancakeSwap, SushiSwap)
- Profit opportunity highlighting
- Historical trend analysis
| Component | Technology | Purpose |
|---|---|---|
| Platform | Railway | PaaS deployment with automatic builds |
| Language | Python 3.12 | Core worker implementation |
| Build | Nixpacks | Automatic dependency detection |
| Storage | Lighthouse | Encrypted decentralized file storage |
| Blockchain | Blockscout MCP | Chain data access layer |
| Oracle | Chainlink | Price feed verification |
| Explorer | Autoscout | Transaction inspection UI |
| Feature | Status | Details |
|---|---|---|
| Worker Deployment | ✅ Live | Running on Railway (asia-southeast1) |
| Lighthouse Upload | ✅ Working | Encrypted JSONL every 5 minutes |
| Auto-Cleanup | ✅ Working | Maintains 1 file only |
| Rolling Window | ✅ Working | 24h price tracking |
| Multi-Chain | ✅ Working | Base + ETH + Polygon |
| HTTP Health | ✅ Working | /health endpoint on port 8787 |
| State Persistence | ✅ Working | Survives restarts |
| ASI:One Agent | 🚧 Future | Conversational interface planned |
| DataCoin Sales | 🚧 Future | Dataset monetization planned |
- ✅ Fixed npm global bin PATH in Nix environment
- ✅ Added `[variables]` section to nixpacks.toml for runtime PATH
- ✅ Lighthouse CLI now available to Python subprocess calls
- ✅ Auto-cleanup working in production
- ✅ Fixed missing `from lighthouse_cleanup import cleanup_lighthouse_storage`
- ✅ Added proper error handling for CLI availability
- ✅ Added setup verification on startup
| Contract | Address | Purpose |
|---|---|---|
| DataCoin (DADC) | `0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC` | Token-gated access to premium data |
| Faucet | `0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB` | Judges claim tokens here |
| Liquidity Pool | `0x8EF4B1670D382b47DBbF30ebE2Bb15e52Ed2236c` | DADC/LSDC pool |
Token Details:
- Name: DEXArb Data Coin
- Symbol: DADC
- Decimals: 18
- Total Supply: 100,000,000,000 DADC
- Metadata: ipfs://bafkreie73wguaf7yucgzcudbkivtgtxzvyv2efjg24s76j67lu7cbt7vcy
Deployment Info:
- Factory: Official 1MB.io factory (`0xC7Bc3432B0CcfeFb4237172340Cd8935f95f2990`)
- Lock Asset: 10,000 LSDC
- Creation Tx: `0x0bf7c4da8b9b05137d08af046d8d360863398f91d747b2d4ac3f5c4bafe235ac`
- Base Network
- Chainlink Price Feeds
- Uniswap V3
- 1MB.io Platform - DataCoin creation platform
This is a production system. For contributions:
- Test locally first with `python apps/worker/run.py`
- Verify Lighthouse upload works
- Check auto-cleanup deletes old files
- Ensure health endpoint responds
- Submit PR with detailed testing notes
Proprietary - All Rights Reserved