An agentic system that qualifies and prioritizes leads based on genuine buying signals and intent. Built for B2B SaaS sales teams who struggle with low response rates from generic cold outreach.
- Multi-Source Lead Scraping: Scrapes buying intent signals from Reddit and LinkedIn using Apify
- AI-Powered Processing: 4-agent CrewAI system (Signal Scout, Researcher, Pitch Architect, Auditor)
- Intent Detection: Identifies active buying intent signals and trigger events
- Hyper-Personalized Pitches: Generates custom outreach messages in under 2 minutes
- Quality Scoring: Validates leads using Kalibr/Pycalib with buyability scores (0-100)
- Monetization: Nevermined integration for payment gatekeeping on high-value leads (score >= 80)
- Modern UI: Next.js frontend with dark, futuristic "cyber-pro" theme
- MCP Integration: Exposes results via Model Context Protocol for Cursor integration
Apify Scraper → CrewAI Agents → Kalibr/Pycalib Validation → Nevermined Monetization → Next.js UI
Apify Scraper (tools/apify_scraper.py)
- Scrapes Reddit and LinkedIn for buying intent signals
- Filters posts with intent keywords
- Returns structured lead data
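As a rough illustration of this step, here is a minimal sketch using the apify-client Python package. The actor ID, input schema, keyword list, and output fields are placeholders and may not match what tools/apify_scraper.py actually does.

```python
# Minimal sketch of intent-signal scraping with the Apify Python client.
# The actor ID, run_input fields, and keyword list are illustrative only.
from apify_client import ApifyClient

INTENT_KEYWORDS = ["looking for", "recommend a tool", "alternatives to"]  # hypothetical

def scrape_intent_posts(apify_token: str) -> list[dict]:
    client = ApifyClient(apify_token)
    # Run a (hypothetical) Reddit scraping actor and wait for it to finish.
    run = client.actor("trudax/reddit-scraper").call(
        run_input={"searches": INTENT_KEYWORDS, "maxItems": 50}
    )
    leads = []
    # Iterate over the dataset produced by the actor run.
    for item in client.dataset(run["defaultDatasetId"]).iterate_items():
        text = (item.get("title", "") + " " + item.get("body", "")).lower()
        # Keep only posts that mention an intent keyword.
        if any(kw in text for kw in INTENT_KEYWORDS):
            leads.append({"source": "reddit", "text": text, "url": item.get("url")})
    return leads
```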
CrewAI Agents (agents/crew_setup.py)
- Signal Scout: Identifies active intent signals
- Researcher: Gathers company intelligence
- Pitch Architect: Creates custom pitches
- Auditor: Validates and scores leads
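A stripped-down sketch of how a four-agent crew like this can be wired up with CrewAI. The roles mirror the list above, but the goals, backstories, and task wiring in agents/crew_setup.py will differ.

```python
# Minimal CrewAI sketch of the four-agent pipeline (illustrative, not the repo's exact setup).
from crewai import Agent, Task, Crew

signal_scout = Agent(
    role="Signal Scout",
    goal="Identify active buying intent signals in scraped posts",
    backstory="Expert at spotting trigger events in community discussions.",
)
researcher = Agent(
    role="Researcher",
    goal="Gather company intelligence for each promising lead",
    backstory="Thorough B2B analyst.",
)
pitch_architect = Agent(
    role="Pitch Architect",
    goal="Draft a hyper-personalized outreach message per lead",
    backstory="Senior SDR copywriter.",
)
auditor = Agent(
    role="Auditor",
    goal="Validate each lead and assign a 0-100 buyability score",
    backstory="Skeptical reviewer who rejects weak signals.",
)

# Only two tasks shown; research and pitch tasks are wired analogously.
scout_task = Task(
    description="Review the scraped posts and flag those with genuine buying intent.",
    expected_output="A shortlist of leads with the intent signal quoted.",
    agent=signal_scout,
)
audit_task = Task(
    description="Score each pitched lead from 0 to 100 for buyability.",
    expected_output="JSON with lead, pitch, and buyability_score.",
    agent=auditor,
)

crew = Crew(
    agents=[signal_scout, researcher, pitch_architect, auditor],
    tasks=[scout_task, audit_task],
)
result = crew.kickoff()
```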
FastAPI Backend (api/main.py)
- RESTful API for lead management
- Nevermined integration endpoints
- Protected asset creation
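A hedged sketch of what the lead endpoints could look like in FastAPI. The route names match the API reference further down, but the request model, in-memory store, and unlock logic are simplified placeholders, not the actual api/main.py.

```python
# Simplified FastAPI sketch of the lead endpoints (illustrative; see api/main.py for the real server).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Lead Sniper AI")

LEADS: dict[str, dict] = {}  # in-memory stand-in for the real lead store

class UnlockRequest(BaseModel):
    lead_id: str  # hypothetical field; the real unlock flow goes through Nevermined

@app.get("/health")
def health():
    return {"status": "ok"}

@app.get("/api/leads")
def get_leads():
    # Hide full details of protected (score >= 80) leads until unlocked.
    return [
        {**lead, "pitch": None} if lead.get("is_protected") and not lead.get("unlocked") else lead
        for lead in LEADS.values()
    ]

@app.post("/api/unlock")
def unlock_lead(req: UnlockRequest):
    lead = LEADS.get(req.lead_id)
    if lead is None:
        raise HTTPException(status_code=404, detail="Lead not found")
    # In the real app, payment verification is handled by the Nevermined middleware.
    lead["unlocked"] = True
    return lead
```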
Next.js Frontend (frontend/)
- Dashboard with lead feed
- Lead detail modals
- Unlock mechanism for high-value leads
- Python 3.13+
- Node.js 18+
- OpenAI API key
- Apify API token
- Nevermined API key (optional, for monetization)
git clone https://github.com/oabolade/Lead_Sniper_AI.git
cd Lead_Sniper_AI

# Create virtual environment
python3.13 -m venv venv
source venv/bin/activate # On Windows: venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
# Set up environment variables
cp .env.example .env
# Edit .env and add your API keys:
# - OPENAI_API_KEY
# - APIFY_API_TOKEN
# - NVM_API_KEY (optional)

cd frontend
npm install
# Set up environment variables
echo "NEXT_PUBLIC_API_URL=http://localhost:8000" > .env.localTerminal 1 - Backend:
source venv/bin/activate
python api/run_server.py

Terminal 2 - Frontend:
cd frontend
npm run dev

# Run the complete workflow
./START_WORKFLOW.sh
# Or add test leads for testing
./add_test_leads.sh

- GET /health - Health check
- GET /api/leads - Get all leads
- POST /api/scrape-and-process - Scrape and process leads
- POST /api/process - Process a single lead
- POST /api/unlock - Unlock a protected lead
- GET /docs - Interactive API documentation
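For quick testing from Python, something like the following can hit the endpoints above. The JSON payload fields are assumptions about the request shape, so check GET /docs for the actual schemas.

```python
# Quick client-side check of the API (payload fields are guesses; see /docs for the real schemas).
import requests

BASE_URL = "http://localhost:8000"

print(requests.get(f"{BASE_URL}/health").json())

# Kick off a scrape-and-process run, then list the resulting leads.
requests.post(f"{BASE_URL}/api/scrape-and-process", json={"keywords": ["looking for a CRM"]})
leads = requests.get(f"{BASE_URL}/api/leads").json()

# Attempt to unlock the first protected lead (hypothetical payload shape).
for lead in leads:
    if lead.get("is_protected"):
        unlocked = requests.post(f"{BASE_URL}/api/unlock", json={"lead_id": lead["id"]}).json()
        print(unlocked)
        break
```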
- Scrape Leads: Use expanded keywords and multiple sources
- Process: Leads go through 4-agent CrewAI pipeline
- Score: Each lead gets a buyability_score (0-100)
- Protect: High-value leads (score >= 80) are protected (see the sketch after this list)
- Unlock: Users pay via Nevermined to unlock full details
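The scoring-and-gating decision can be summarized in a few lines. The threshold of 80 comes from the feature list above, while the field names are illustrative rather than the exact schema the pipeline produces.

```python
# Illustrative gating rule: leads scoring >= 80 are marked protected and require
# a Nevermined payment before full details are shown (field names are assumptions).
PROTECTION_THRESHOLD = 80

def gate_lead(lead: dict) -> dict:
    score = lead.get("buyability_score", 0)
    lead["is_protected"] = score >= PROTECTION_THRESHOLD
    if lead["is_protected"]:
        # Only a teaser is exposed until the user unlocks via Nevermined.
        lead["preview"] = {"company": lead.get("company"), "score": score}
    return lead
```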
lead_sniper_ai/
├── agents/ # CrewAI agent definitions
│ ├── crew_setup.py # Main crew configuration
│ └── agent_personas.py # Agent personas
├── tools/ # Apify scraper tools
│ └── apify_scraper.py
├── api/ # FastAPI backend
│ ├── main.py # Main API server
│ ├── nevermined_middleware.py
│ └── run_server.py
├── frontend/ # Next.js frontend
│ ├── app/
│ ├── components/
│ └── lib/
└── requirements.txt # Python dependencies
- .env files are gitignored
- API keys should never be committed
- Use environment variables for all secrets
See TROUBLESHOOTING.md for common issues and solutions.
MIT License
Built for a 24-hour hackathon. Uses:
- CrewAI for agent orchestration
- Apify for web scraping
- Nevermined for monetization
- Next.js for frontend
✅ Core functionality complete ✅ End-to-end workflow tested ✅ UI implemented ✅ Nevermined integration ready
Built with ❤️ for the hackathon