shreyas-sovani/Alpha-Foundry


DEXArb Intelligence Platform

Real-time MEV opportunities, encrypted on-chain, unlocked with tokens.

ETHOnline 2025 • Lighthouse • Live on Railway • Sepolia

🎯 Try Demo • 📜 Contracts • 🔥 Get Tokens


What This Actually Does

You're a trader looking for arbitrage opportunities across DEX pools. The data is valuable, but you don't want bots scraping it for free.

This platform:

  1. Monitors DEX swaps on Ethereum mainnet (Uniswap V3 USDC/WETH pool)
  2. Detects price deltas that signal arbitrage opportunities
  3. Encrypts everything using Lighthouse's native encryption
  4. Gates access with ERC20 tokens - 1 DADC = 1 decrypt
  5. Burns tokens when you unlock data, creating real scarcity

Think of it as Bloomberg Terminal data, but encrypted on IPFS and unlocked with crypto tokens. Each decrypt costs 1 DADC token (destroyed forever), so 100 tokens = exactly 100 uses.


🎫 For ETHOnline Judges

Get access in 30 seconds:

  1. Claim tokens → Open Faucet
  2. Connect wallet → Click "Connect to Web3"
  3. Call claimTokens() → Get 100 DADC instantly
  4. Visit app → dexarb-data-unlock.vercel.app
  5. Unlock data → Burns 1 DADC, shows arbitrage opportunities

Network: Sepolia Testnet
Cost: 1 DADC per decrypt (burned to 0xdead)
Claim limit: Once per wallet


How It Works

Backend (Railway)

Python Worker
├── Polls Uniswap V3 swaps every 15 seconds (Blockscout API)
├── Calculates arbitrage spreads (token price deltas)
├── Packages as JSONL with timestamps
├── Encrypts with Lighthouse SDK
├── Sets access control: balanceOf(DADC) >= 1 token
└── Auto-cleans old files (keeps only latest)

Key File: apps/worker/lighthouse_native_encryption.py

  • Uses Lighthouse's Kavach encryption (not custom AES)
  • Signs auth messages with eth-account
  • Applies ERC20 access control via SDK
  • Uploads to IPFS with distributed key shards
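The worker cycle above can be sketched as a single function; the helper names (fetch_swaps, encrypt_and_upload, and so on) are illustrative stand-ins, not the actual API of apps/worker/run.py:

```python
from typing import Callable, Optional

def run_cycle(fetch_swaps: Callable[[], list],
              transform: Callable[[list], str],
              encrypt_and_upload: Callable[[str], str],
              cleanup: Callable[[str], None]) -> Optional[str]:
    """One worker iteration: fetch -> transform -> encrypt/upload -> cleanup."""
    swaps = fetch_swaps()            # poll Blockscout for new Uniswap V3 swaps
    if not swaps:
        return None                  # nothing new this round
    jsonl = transform(swaps)         # one JSON object per line, timestamped
    cid = encrypt_and_upload(jsonl)  # Lighthouse encryption + IPFS upload
    cleanup(cid)                     # delete older files, keep only this CID
    return cid
```

Injecting the four stages as callables keeps the loop trivially testable without network access.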

Frontend (Vercel)

Next.js App
├── Connects MetaMask wallet
├── Checks DADC balance → Shows "X decrypts available"
├── User clicks unlock → Burns 1 DADC to 0xdead
├── Signs decryption request → Lighthouse checks new balance
├── Lighthouse grants access → Downloads encrypted file
└── Decrypts locally → Shows arbitrage data

Key File: frontend/app/page.tsx

  • Implements burnTokenForAccess() function
  • Transfers 1 DADC to burn address before each decrypt
  • Uses Lighthouse SDK for decryption
  • Updates UI to show remaining decrypts

Smart Contracts (Sepolia)

DataCoin (ERC20)
├── Symbol: DADC
├── Supply: 100 billion
├── Created via 1MB.io factory
└── Used for access control

DataCoinFaucet
├── Gives 100 DADC per wallet
├── One-time claim (anti-spam)
└── Instant minting

Deployment: Created through official 1MB.io factory with 10,000 LSDC lock


Architecture

┌──────────────────────────────────────────────────────────────┐
│                       User Browser                           │
│  ┌────────────────────────────────────────────────────────┐  │
│  │         Next.js Frontend (Vercel)                      │  │
│  │  • Connect wallet                                      │  │
│  │  • Burn 1 DADC → 0xdead                                │  │
│  │  • Sign decrypt request                                │  │
│  │  • Download + decrypt locally                          │  │
│  └───────────┬────────────────────────────────────────────┘  │
└──────────────┼───────────────────────────────────────────────┘
               │
               ├──────────► Sepolia Testnet
               │             └── DataCoin.balanceOf() check
               │
               ├──────────► Lighthouse Storage
               │             ├── Check access control
               │             ├── Distribute key shards
               │             └── Serve encrypted CID
               │
┌──────────────┼───────────────────────────────────────────────┐
│              ▼                                               │
│      Python Worker (Railway)                                 │
│  ┌─────────────────────────────────────────────────────┐     │
│  │  1. Poll Blockscout API (Uniswap V3 swaps)          │     │
│  │  2. Transform to JSONL (arbitrage detection)        │     │
│  │  3. Encrypt with Lighthouse native SDK              │     │
│  │  4. Apply ERC20 gating (DADC balance >= 1)          │     │
│  │  5. Upload to IPFS (distributed key shards)         │     │
│  │  6. Cleanup old files (keep only latest)            │     │
│  └─────────────────────────────────────────────────────┘     │
└──────────────────────────────────────────────────────────────┘
               │
               ├──────────► Ethereum Mainnet RPC
               │             └── Uniswap V3 swap events
               │
               └──────────► Lighthouse API
                             └── Upload encrypted files

Token Economics (The Cool Part)

Problem: How do you monetize data without letting bots scrape it once and share forever?

Solution: Burn tokens on every access.

User Journey:
1. Claim 100 DADC from faucet → Balance: 100 DADC
2. Unlock data (burn 1 DADC) → Balance: 99 DADC
3. Unlock again (burn 1 DADC) → Balance: 98 DADC
...
100. Last unlock → Balance: 0 DADC (no more access)

Why it works:

  • Lighthouse checks balanceOf(userAddress) >= 1 DADC before each decrypt
  • Frontend voluntarily burns 1 DADC before requesting access
  • User's balance decreases β†’ Lighthouse naturally denies access when balance hits 0
  • No backend burn logic needed (Lighthouse is read-only)
  • Creates real scarcity (deflationary model)

Live burn transactions: View on Etherscan


Tech Stack

| Layer | Technology | Why |
|---|---|---|
| Frontend | Next.js 14 + ethers.js v6 | React for UI, ethers for wallet interactions |
| Encryption | Lighthouse SDK v0.3.3 | Native Kavach encryption with BLS threshold crypto |
| Backend | Python 3.12 + aiohttp | Async worker for high-throughput swap ingestion |
| Blockchain Data | Blockscout MCP Server | Structured API for Uniswap V3 swap events |
| Smart Contracts | Solidity 0.8.20 | ERC20 (DataCoin) + Faucet on Sepolia |
| Deployment | Railway + Vercel | Backend on Railway, frontend on Vercel |
| State | JSON files | Checkpoints, deduplication, price buffers |

Project Structure

af_hosted/
├── apps/
│   └── worker/                         # Backend (Railway)
│       ├── run.py                      # Main loop: fetch → transform → encrypt → upload
│       ├── lighthouse_native_encryption.py  # Lighthouse SDK wrapper
│       ├── blockscout_client.py        # Blockscout MCP integration
│       ├── transform.py                # DEX event → arbitrage JSONL
│       ├── http_server.py              # Health endpoint
│       └── settings.py                 # Config from env vars
│
├── frontend/                           # Frontend (Vercel)
│   ├── app/
│   │   ├── page.tsx                    # Main unlock page (burn + decrypt)
│   │   ├── layout.tsx                  # Next.js layout
│   │   └── globals.css                 # Tailwind styles
│   ├── next.config.js                  # Next.js config
│   └── package.json                    # npm dependencies
│
├── contracts/
│   └── DataCoinFaucet.sol              # One-time claim faucet
│
├── scripts/
│   ├── createDEXArbDataCoin.js         # 1MB.io factory deployment
│   ├── deployFaucet.js                 # Faucet deployment
│   └── verify_lighthouse_protection.py # Test access control
│
├── state/                              # Worker state (gitignored)
│   ├── last_block.json                 # Last processed block
│   ├── dedupe.json                     # Prevent duplicate swaps
│   └── price_buffer.json               # 24h price rolling window
│
├── requirements.txt                    # Python deps
├── package.json                        # Root npm deps (DataCoin creation)
├── nixpacks.toml                       # Railway build config
└── Procfile                            # Railway start command

Smart Contracts

DataCoin (DADC)

Address: 0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
Network: Sepolia Testnet
Standard: ERC20 (1MB.io DataCoin)
Supply: 100,000,000,000 DADC (100 billion)
Decimals: 18

Deployment:

  • Created via official 1MB.io factory
  • Locked 10,000 LSDC (Lighthouse Sepolia Data Coin)
  • 99% supply available for minting
  • 1% in Uniswap liquidity pool

📜 View on Etherscan

DataCoinFaucet

Address: 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB
Function: claimTokens() → Mints 100 DADC
Limit: One claim per address

📜 View on Etherscan


Running Locally

Prerequisites

  • Node.js 18+ (for Lighthouse SDK)
  • Python 3.12+
  • MetaMask wallet on Sepolia

Backend (Worker)

cd apps/worker

# Install dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt

# Install Lighthouse CLI (required for cleanup)
npm install -g @lighthouse-web3/sdk

# Configure environment
export BLOCKSCOUT_MCP_BASE="https://eth-sepolia.blockscout.com/api/v2"
export CHAIN_ID=11155111
export LIGHTHOUSE_API_KEY="your_key"
export LIGHTHOUSE_WALLET_PRIVATE_KEY="0x..."
export DATACOIN_CONTRACT_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"

# Run worker
python run.py

Expected output:

✓ Blockscout MCP client initialized
✓ Lighthouse encryption configured
✓ HTTP server started on http://0.0.0.0:8787
✓ Fetched 47 swaps from Uniswap V3 USDC/WETH
✓ Lighthouse upload successful: QmXXX...
✓ Auto-cleanup: deleted 3 old files

Frontend

cd frontend

# Install dependencies
npm install

# Configure environment
export NEXT_PUBLIC_DATACOIN_ADDRESS="0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"
export NEXT_PUBLIC_FAUCET_ADDRESS="0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB"
export NEXT_PUBLIC_CHAIN_ID=11155111
export NEXT_PUBLIC_METADATA_API="http://localhost:8787"

# Start dev server
npm run dev

Visit: http://localhost:3000


Deployment

Backend (Railway)

  1. Connect repo to Railway
  2. Set env vars:
    LIGHTHOUSE_API_KEY=...
    LIGHTHOUSE_WALLET_PRIVATE_KEY=0x...
    BLOCKSCOUT_MCP_BASE=https://eth-sepolia.blockscout.com/api/v2
    CHAIN_ID=11155111
    DATACOIN_CONTRACT_ADDRESS=0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
    
  3. Deploy → Railway auto-detects nixpacks.toml
  4. Verify → Check logs for "✅ Lighthouse upload successful"

Frontend (Vercel)

  1. Connect repo to Vercel
  2. Set env vars:
    NEXT_PUBLIC_DATACOIN_ADDRESS=0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC
    NEXT_PUBLIC_FAUCET_ADDRESS=0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB
    NEXT_PUBLIC_CHAIN_ID=11155111
    
  3. Deploy → Auto-deploys from main branch
  4. Visit → https://dexarb-data-unlock.vercel.app

Key Features Explained

πŸ” Lighthouse Native Encryption

  • Uses Lighthouse's Kavach encryption (threshold BLS cryptography)
  • Distributes key shards across 5 nodes
  • Checks ERC20 balance on every decrypt attempt
  • No backend decryption logic needed

Code: apps/worker/lighthouse_native_encryption.py
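For illustration, an ERC20 gating rule of this shape might look roughly as follows on the worker side; the exact field names are defined by the Lighthouse SDK's access-condition format, so treat this as a sketch rather than the canonical schema:

```python
DADC = "0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC"

# One whole DADC token expressed in base units (the token uses 18 decimals).
ONE_DADC = str(10**18)

access_conditions = [{
    "id": 1,
    "chain": "Sepolia",
    "method": "balanceOf",
    "standardContractType": "ERC20",
    "contractAddress": DADC,
    "returnValueTest": {"comparator": ">=", "value": ONE_DADC},
    "parameters": [":userAddress"],  # substituted with the requesting wallet
}]
aggregator = "([1])"                 # single condition, no boolean combination
```

The key shard nodes evaluate this rule against the live chain state on every decrypt request, which is why no backend decryption logic is needed.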

🔥 Token Burning Mechanism

  • Frontend burns 1 DADC to 0xdead before each decrypt
  • Lighthouse checks updated balance
  • When balance = 0, access denied
  • Deflationary (tokens destroyed forever)

Code: frontend/app/page.tsx → burnTokenForAccess()

🧹 Auto-Cleanup

  • Deletes old encrypted files after each upload
  • Keeps only latest file (saves storage costs)
  • Uses Lighthouse CLI for reliable deletion

Code: apps/worker/lighthouse_cleanup.py

📊 Arbitrage Detection

  • Tracks 24h rolling window of token prices
  • Calculates min/max/mean for each token
  • Highlights profitable spreads
  • Persists state across restarts

Code: apps/worker/transform.py
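A minimal sketch of the 24h rolling-window statistics described above (the class and method names are hypothetical, not the actual transform.py API):

```python
import time
from collections import deque

WINDOW_SECONDS = 24 * 3600  # 24-hour window

class RollingPriceWindow:
    """Rolling price buffer with min/max/mean stats for one token."""
    def __init__(self, window=WINDOW_SECONDS):
        self.window = window
        self.points = deque()  # (timestamp, price) pairs, oldest first

    def add(self, price, ts=None):
        now = time.time() if ts is None else ts
        self.points.append((now, price))
        cutoff = now - self.window
        while self.points and self.points[0][0] < cutoff:
            self.points.popleft()  # expire entries older than the window

    def stats(self):
        prices = [p for _, p in self.points]
        if not prices:
            return {}
        return {"min": min(prices), "max": max(prices),
                "mean": sum(prices) / len(prices)}
```

Because entries carry timestamps, the deque can be serialized to state/price_buffer.json and restored on restart, which is how the stats survive redeploys.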


Testing Access Control

# Test without tokens (should fail)
python scripts/verify_lighthouse_protection.py \
  --cid QmXXX... \
  --wallet 0xWithoutTokens

# Output: ❌ Access denied (balance = 0)

# Claim tokens from faucet
cast send 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB \
  "claimTokens()" \
  --rpc-url https://ethereum-sepolia-rpc.publicnode.com \
  --private-key 0x...

# Test with tokens (should succeed)
python scripts/verify_lighthouse_protection.py \
  --cid QmXXX... \
  --wallet 0xWithTokens

# Output: ✅ Access granted (balance = 100 DADC)



Why This Matters

The Problem: On-chain data is either:

  1. Free → Anyone can scrape it (no monetization)
  2. Centralized → Behind API keys/paywalls (censorship risk)

This Solution:

  • Data lives on IPFS (decentralized, permanent)
  • Encrypted with threshold cryptography (no single point of failure)
  • Access gated by ERC20 tokens (programmable economics)
  • Tokens burned on use (creates scarcity)

Result: True data markets on-chain. No middleman, no censorship, pay-per-use.




License

MIT - See LICENSE for details


Made for ETHOnline 2025

🎯 Try Demo • 📜 Contracts • 🔥 Get Tokens

🎯 What It Does

  • πŸ“Š Real-Time Monitoring: Ingests swap events from Base, Ethereum, Polygon DEX pools every 5 minutes
  • πŸ’Ž Arbitrage Detection: Identifies profitable price deltas across DEX pairs using rolling 24h windows
  • πŸ” Encrypted Storage: Publishes datasets to Lighthouse with automatic old file cleanup (maintains single latest file)
  • πŸ” Block Explorer Integration: Every transaction links to Autoscout for deep inspection
  • πŸ’¬ AI Agent Interface: ASI:One chat protocol for conversational data access (future)
  • πŸ’° Data Monetization: Package and sell curated datasets via Lighthouse DataCoin (future)

πŸ—οΈ Architecture

┌──────────────────────────────────────────────────────────────┐
│                    🌐 Railway Platform                       │
│  ┌────────────────────────────────────────────────────────┐  │
│  │              Python Worker (apps/worker)               │  │
│  │  • Polls DEX events every 5 min via Blockscout MCP     │  │
│  │  • Transforms to JSONL with arbitrage detection        │  │
│  │  • Encrypts & uploads to Lighthouse                    │  │
│  │  • Auto-deletes old files (keeps only latest)          │  │
│  │  • Maintains state: checkpoints, deduplication         │  │
│  └─────────────┬──────────────────────────────────────────┘  │
└────────────────┼─────────────────────────────────────────────┘
                 │
     ┌───────────┼───────────┐
     ▼           ▼           ▼
┌──────────┐ ┌─────────┐ ┌──────────────┐
│Blockscout│ │Chainlink│ │ Lighthouse   │
│   MCP    │ │  Price  │ │   Storage    │
│  Server  │ │  Feeds  │ │ (Encrypted)  │
└──────────┘ └─────────┘ └──────────────┘
     │           │              │
     ▼           ▼              ▼
┌─────────────────────────────────────────┐
│    Base/ETH/Polygon Mainnet (RPC)       │
└─────────────────────────────────────────┘

Production Features:

  • βœ… Auto-Cleanup: Maintains only 1 file on Lighthouse (deletes old uploads automatically)
  • βœ… Rolling Window: 24-hour price tracking for accurate arbitrage detection
  • βœ… State Persistence: Checkpoints, deduplication, price buffers survive restarts
  • βœ… Dual API Failover: Primary/fallback endpoint switching on errors
  • βœ… HTTP Health Endpoint: /health for Railway monitoring
  • βœ… Multi-Chain: Base, Ethereum, Polygon support

🚀 Quick Start (Railway Deployment)

Prerequisites

  • Railway account (railway.app)
  • Lighthouse API key (files.lighthouse.storage)
  • Blockscout API keys for Base, Ethereum, Polygon
  • Chainlink price feed contracts (optional, uses fallback if not set)

Deploy to Railway

  1. Connect Repository

    railway link
  2. Set Environment Variables (in Railway dashboard or CLI)

    # Core Configuration
    BLOCKSCOUT_BASE_URL=https://base.blockscout.com
    BLOCKSCOUT_BASE_API_KEY=your_base_key
    BLOCKSCOUT_ETH_URL=https://eth.blockscout.com
    BLOCKSCOUT_ETH_API_KEY=your_eth_key
    BLOCKSCOUT_POLYGON_URL=https://polygon.blockscout.com
    BLOCKSCOUT_POLYGON_API_KEY=your_polygon_key
    
    # Lighthouse Storage (auto-cleanup enabled)
    LIGHTHOUSE_API_KEY=your_lighthouse_key
    
    # Optional: Chainlink Price Feeds (uses fallback if not set)
    CHAINLINK_ETH_USD=0x5f4eC3Df9cbd43714FE2740f5E3616155c5b8419
    CHAINLINK_USDC_USD=0x8fFfFfd4AfB6115b954Bd326cbe7B4BA576818f6
  3. Deploy

    railway up
  4. Verify

    • Check Railway logs for "βœ… Lighthouse upload successful"
    • Visit /health endpoint to confirm worker is running
    • Check Lighthouse dashboard - should see only 1 encrypted file (auto-cleanup working)

Local Development

# Setup
python -m venv venv
source venv/bin/activate  # or `venv\Scripts\activate` on Windows
pip install -r requirements.txt

# Configure
cp .env.example .env
# Edit .env with your API keys

# Run worker
python apps/worker/run.py

# Test HTTP endpoint
curl http://localhost:8787/health

πŸ“ Project Structure

af_hosted/
├── apps/
│   ├── worker/                    # 🔄 Core ingestion worker
│   │   ├── run.py                 # Main entry point
│   │   ├── blockscout_client.py   # MCP server integration
│   │   ├── chainlink_price.py     # Price feed oracle
│   │   ├── transform.py           # JSONL transformation
│   │   ├── http_server.py         # Health endpoint
│   │   ├── lighthouse_cleanup.py  # Auto file cleanup (NEW)
│   │   └── state/                 # Checkpoints & deduplication
│   └── hosted-agent/              # 💬 Future: ASI:One chat interface
├── infra/
│   └── autoscout/                 # 🔍 Block explorer config
│       └── instance.json          # Explorer URLs
├── scripts/
│   ├── run_worker.sh              # Local development runner
│   └── verify_demo.py             # Demo data generator
├── state/                         # 💾 Persistent state (gitignored)
│   ├── last_block.json
│   ├── dedupe.json
│   ├── price_buffer.json
│   └── block_ts.json
├── nixpacks.toml                  # Railway build configuration
├── Procfile                       # Railway startup command
└── requirements.txt               # Python dependencies

🔑 Key Features

✨ Lighthouse Auto-Cleanup (NEW)

  • Automatic: Runs after every successful upload
  • Efficient: Deletes all old files, keeps only latest
  • Smart: Uses Lighthouse CLI for reliable deletion
  • Fast: Parallel deletion with progress tracking
  • Production-Ready: Full error handling & logging
# Happens automatically after each upload
cleanup_lighthouse_storage(
    api_key=LIGHTHOUSE_API_KEY,
    protected_cid="<latest_file_cid>",  # Don't delete this one
    dry_run=False                        # Delete for real
)
# Result: Only 1 file on Lighthouse ✅

📊 Rolling Window Price Tracking

  • 24-hour price history per token
  • Accurate min/max/mean calculations
  • Automatic expiry of old data
  • Persistent across restarts

🔄 Dual API Architecture

  • Primary endpoint: api.lighthouse.storage
  • Fallback endpoint: upload.lighthouse.storage
  • Automatic failover on errors
  • SDK upload with REST fallback

🎯 Arbitrage Detection

  • Real-time price delta calculation
  • Multi-DEX comparison (Uniswap, PancakeSwap, SushiSwap)
  • Profit opportunity highlighting
  • Historical trend analysis

πŸ› οΈ Technology Stack

| Component | Technology | Purpose |
|---|---|---|
| Platform | Railway | PaaS deployment with automatic builds |
| Language | Python 3.12 | Core worker implementation |
| Build | Nixpacks | Automatic dependency detection |
| Storage | Lighthouse | Encrypted decentralized file storage |
| Blockchain | Blockscout MCP | Chain data access layer |
| Oracle | Chainlink | Price feed verification |
| Explorer | Autoscout | Transaction inspection UI |

📊 Production Status

| Feature | Status | Details |
|---|---|---|
| Worker Deployment | ✅ Live | Running on Railway (asia-southeast1) |
| Lighthouse Upload | ✅ Working | Encrypted JSONL every 5 minutes |
| Auto-Cleanup | ✅ Working | Maintains 1 file only |
| Rolling Window | ✅ Working | 24h price tracking |
| Multi-Chain | ✅ Working | Base + ETH + Polygon |
| HTTP Health | ✅ Working | /health endpoint on port 8787 |
| State Persistence | ✅ Working | Survives restarts |
| ASI:One Agent | 🚧 Future | Conversational interface planned |
| DataCoin Sales | 🚧 Future | Dataset monetization planned |

πŸ› Recent Fixes

Lighthouse CLI Integration (Oct 2025)

  • βœ… Fixed npm global bin PATH in Nix environment
  • βœ… Added [variables] section to nixpacks.toml for runtime PATH
  • βœ… Lighthouse CLI now available to Python subprocess calls
  • βœ… Auto-cleanup working in production

Import Bug Fix (Oct 2025)

  • βœ… Fixed missing from lighthouse_cleanup import cleanup_lighthouse_storage
  • βœ… Added proper error handling for CLI availability
  • βœ… Added setup verification on startup

Smart Contract Addresses (Sepolia Testnet)

| Contract | Address | Purpose |
|---|---|---|
| DataCoin (DADC) | 0x8d302FfB73134235EBaD1B9Cd9C202d14f906FeC | Token-gated access to premium data |
| Faucet | 0xB0864079e5A5f898Da37ffF6c8bce762A2eD35BB | Judges claim tokens here |
| Liquidity Pool | 0x8EF4B1670D382b47DBbF30ebE2Bb15e52Ed2236c | DADC/LSDC pool |

📚 Documentation & Resources


🤝 Contributing

This is a production system. For contributions:

  1. Test locally first with python apps/worker/run.py
  2. Verify Lighthouse upload works
  3. Check auto-cleanup deletes old files
  4. Ensure health endpoint responds
  5. Submit PR with detailed testing notes

📄 License

Proprietary - All Rights Reserved
