AliveHunter is an ultra-fast web discovery tool written in Go, designed to check if URLs are alive with maximum speed and zero false positives. Built for reconnaissance and subdomain validation, it outperforms similar tools like httpx and dnsx while maintaining reliability.
- Ultra-fast scanning - 2-3x faster than httpx out-of-the-box
- Zero false positives - Advanced verification to eliminate parked domains and error pages
- Pipeline-friendly - Perfect integration with subdomain discovery tools
- Multiple operation modes - Fast, balanced, and verification modes
- Smart verification - Detects wildcards, parked domains, and default pages
- Title extraction - Both fast and robust HTML parsing options
- JSON output - Structured data for further processing
- Configurable TLS - Support for different TLS versions
- Rate limiting - Built-in rate control and worker management
- Graceful shutdown - Clean termination with progress preservation
| Tool | Default Speed | Accuracy | False Positives |
|---|---|---|---|
| AliveHunter (default) | ~300 req/s | 99.9% | ~0% |
| AliveHunter (fast mode) | ~500+ req/s | 98% | <1% |
| AliveHunter (verify) | ~150 req/s | 100% | 0% |
| httpx | ~100 req/s | 95% | ~5% |
| dnsx | ~200 req/s | 90% | ~10% |
- Go 1.19 or higher
Required dependencies (auto-installed):
- github.com/fatih/color
- golang.org/x/net/html
- golang.org/x/time/rate
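The golang.org/x/net/html dependency backs the robust title parser (`-robust-title`); the default fast title path can be approximated with plain string scanning. A minimal sketch of that idea (illustrative, not AliveHunter's actual implementation):

```go
package main

import (
	"fmt"
	"strings"
)

// fastTitle pulls the first <title> out of raw HTML with plain string
// scanning: fast, but fooled by comments or unusual markup, which is
// why a real HTML parser (golang.org/x/net/html) exists as a fallback.
func fastTitle(body string) string {
	lower := strings.ToLower(body)
	open := strings.Index(lower, "<title")
	if open == -1 {
		return ""
	}
	gt := strings.Index(lower[open:], ">")
	if gt == -1 {
		return ""
	}
	start := open + gt + 1
	end := strings.Index(lower[start:], "</title>")
	if end == -1 {
		return ""
	}
	return strings.TrimSpace(body[start : start+end])
}

func main() {
	page := `<html><head><title> Example Domain </title></head></html>`
	fmt.Println(fastTitle(page)) // prints "Example Domain"
}
```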
git clone https://github.com/Acorzo1983/AliveHunter.git && cd AliveHunter && chmod +x install.sh && ./install.sh

If you have Go installed and configured:
go install github.com/Acorzo1983/AliveHunter@latest

If you face network issues (like proxy timeouts) or want to build manually:
git clone https://github.com/Acorzo1983/AliveHunter.git
cd AliveHunter
# If you have proxy issues:
export GOPROXY=direct
# Install dependencies and build
go mod tidy
go build -ldflags "-s -w" -o alivehunter AliveHunter.go
# Install (optional)
sudo mv alivehunter /usr/local/bin/

For GitHub Releases: with the Makefile you can easily create releases:
# Compile for all platforms
make clean && make build-all
# Binaries will be in build/ ready to upload:
# - alivehunter-linux-amd64
# - alivehunter-windows-amd64.exe
# - alivehunter-darwin-amd64

The installer will:
✅ Check Go version compatibility
✅ Download and install dependencies automatically
✅ Build optimized binary with performance flags
✅ Install to /usr/local/bin/ for global access
✅ Verify installation and test functionality
AliveHunter reads URLs from stdin, making it perfect for pipeline integration:
Basic Usage
# Single URL
echo "example.com" | alivehunter
# Multiple URLs from file
cat domains.txt | alivehunter
# Silent mode for pipelines
cat domains.txt | alivehunter -silent
# Bug bounty pipeline
alivehunter -l scope.txt -fast -silent | nuclei -t cves/

Fast Mode (Maximum Speed)
Perfect for initial filtering of large lists:
cat domains.txt | alivehunter -fast -silent
- Speed: ~500+ req/s
- Accuracy: 98%
- Use case: Quick filtering, large datasets
Default Mode (Balanced)
Optimal speed with zero false positives:
cat domains.txt | alivehunter -silent
- Speed: ~300 req/s
- Accuracy: 99.9%
- Use case: General reconnaissance, balanced performance
Verify Mode (Maximum Accuracy)
Zero false positives guaranteed:
cat domains.txt | alivehunter -verify -silent
- Speed: ~150 req/s
- Accuracy: 100%
- Use case: Final validation, critical targets
Advanced Features

Title Extraction
# Fast title extraction
cat domains.txt | alivehunter -title
# Robust HTML parsing for complex pages
cat domains.txt | alivehunter -title -robust-title

# Maximum throughput configuration
cat domains.txt | alivehunter -fast -t 200 -rate 300
# Conservative but thorough scanning
cat domains.txt | alivehunter -verify -t 50 -timeout 10s

JSON Output
# Structured output for further processing
cat domains.txt | alivehunter -json -silent | jq '.alive'
# Full data extraction with verification
cat domains.txt | alivehunter -json -title -verify

Status Code Filtering
# Only show specific status codes
cat domains.txt | alivehunter -mc 200,301,302
# Show failed requests for debugging
cat domains.txt | alivehunter -show-failed

With Subfinder
# Basic integration
subfinder -d target.com | alivehunter -silent
# With title extraction
subfinder -d target.com | alivehunter -silent -title
# Complete reconnaissance pipeline
subfinder -d target.com | alivehunter -silent | httpx -title -tech

With Other Discovery Tools
# With amass
amass enum -d target.com | alivehunter -fast -silent
# With assetfinder
assetfinder target.com | alivehunter -verify -silent
# Chain with nuclei for vulnerability scanning
cat domains.txt | alivehunter -silent | nuclei -t vulnerabilities/

Advanced Multi-Stage Pipelines
# Multi-stage validation for accuracy
subfinder -d target.com | \
alivehunter -fast -silent | \
alivehunter -verify -silent | \
httpx -title -tech -status-code
# JSON processing pipeline
cat domains.txt | \
alivehunter -json -title -silent | \
jq -r 'select(.alive) | .url' | \
nuclei -silent

Core Performance Options
-t, -threads int Number of concurrent workers (default: 100)
-rate float Requests per second limit (default: 100)
-timeout duration HTTP request timeout (default: 3s)
-silent Silent mode for pipeline integration

Operation Mode Flags
-fast Maximum speed mode (minimal verification)
-verify Zero false positives mode (comprehensive verification)

Output Configuration
-json JSON output format
-title Extract HTML page titles
-robust-title Use robust HTML parser for titles (slower but more reliable)
-show-failed Display failed requests and error details

Filtering and Matching
-mc string Match only specific status codes (comma separated)
-follow-redirects Follow HTTP redirections (up to 3 hops)
-tls-min string Minimum TLS version: 1.0, 1.1, 1.2, 1.3 (default: 1.2)

Sample output:
https://example.com [Example Domain] [200]
https://api.example.com [API Gateway] [VERIFIED]
https://blog.example.com [301]
https://secure.example.com [403]
JSON Output Format
{
"url": "https://example.com",
"status_code": 200,
"content_length": 1256,
"response_time_ms": "45ms",
"title": "Example Domain",
"server": "nginx/1.18.0",
"redirect": "",
"error": "",
"alive": true,
"verified": true
}

AliveHunter uses advanced verification to eliminate false positives:
What Gets Automatically Detected
✅ Parked domains and "Domain For Sale" pages
✅ Default web server pages (nginx, Apache, IIS welcome pages)
✅ Error pages disguised as HTTP 200 responses
✅ Wildcard DNS responses from hosting providers
✅ CDN and hosting provider placeholder pages
✅ Suspended account and maintenance pages
Verification Intelligence
- Default mode: Verifies responses from common web servers serving HTML
- Verify mode: Comprehensive verification of all successful responses
- Fast mode: Minimal verification for maximum speed
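One way checks like these can be implemented is a case-insensitive substring scan of the response body against known indicator phrases. A sketch under that assumption (not AliveHunter's actual code, and only a sample of its phrase list):

```go
package main

import (
	"fmt"
	"strings"
)

// parkedIndicators is a small sample of the kinds of phrases a
// verifier can scan for; the tool's real indicator list is larger.
var parkedIndicators = []string{
	"domain for sale", "parked domain", "coming soon",
	"welcome to nginx", "apache2 default", "suspended",
}

// looksParked reports whether a response body matches any known
// false-positive phrase (case-insensitive substring scan).
func looksParked(body string) bool {
	lower := strings.ToLower(body)
	for _, p := range parkedIndicators {
		if strings.Contains(lower, p) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(looksParked("<h1>Welcome to nginx!</h1>")) // true
	fmt.Println(looksParked("<h1>Acme Corp portal</h1>"))  // false
}
```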
Common False Positive Patterns Detected
"domain for sale", "parked domain", "coming soon", "under construction", "default page", "welcome to nginx", "apache2 default", "suspended", "godaddy", "namecheap", "sedo domain parking", "plesk default"

Real-World Examples

Bug Bounty Reconnaissance Workflow
# 1. Subdomain discovery
subfinder -d target.com > subdomains.txt
amass enum -d target.com >> subdomains.txt
# 2. Quick initial filtering (fast mode)
cat subdomains.txt | sort -u | alivehunter -fast -silent > live_initial.txt
# 3. Verification pass (zero false positives)
cat live_initial.txt | alivehunter -verify -title -silent > verified_targets.txt
# 4. Technology detection and further analysis
cat verified_targets.txt | httpx -title -tech -probe > final_results.txt

Large Scale Asset Discovery
# High-performance scanning of large datasets
cat million_subdomains.txt | alivehunter -fast -t 300 -rate 500 -silent > live_fast.txt
# Verify critical findings
cat live_fast.txt | head -1000 | alivehunter -verify -json > verified.json

Specific Status Code Hunting
# Find authentication endpoints
cat domains.txt | alivehunter -mc 401,403 -silent > auth_endpoints.txt
# Find redirect chains
cat domains.txt | alivehunter -mc 301,302,307,308 -follow-redirects > redirects.txt

Integration with Security Tools
# Nuclei vulnerability scanning
cat domains.txt | alivehunter -silent | nuclei -t cves/ -o vulnerabilities.txt
# Burp Suite scope preparation
cat domains.txt | alivehunter -json | jq -r '.url' > burp_scope.txt
# Custom analysis with Python
cat domains.txt | alivehunter -json | python3 analyze_results.py

Performance Tuning
# CPU-intensive configuration (utilize all cores)
alivehunter -t $(( $(nproc) * 50 )) -rate 1000
# Memory-conscious scanning for limited resources
alivehunter -t 25 -rate 25 -timeout 15s
# Network-optimized for high bandwidth
alivehunter -fast -t 200 -rate 400 -timeout 1s
# Conservative scanning for unreliable networks
alivehunter -verify -t 10 -rate 5 -timeout 30s

Security-Focused Configuration
# Modern TLS only (TLS 1.3)
alivehunter -tls-min 1.3
# Compatible with legacy systems (TLS 1.0)
alivehunter -tls-min 1.0
# Secure default (TLS 1.2 - recommended)
alivehunter -tls-min 1.2

Specialized Use Cases
# API endpoint discovery
cat api_endpoints.txt | alivehunter -mc 200,401,403 -json
# Subdomain takeover hunting
cat subdomains.txt | alivehunter -verify -json | jq 'select(.status_code == 404)'
# CDN and hosting provider analysis
cat domains.txt | alivehunter -show-failed -json | grep -i "cloudflare\|aws\|azure"

Troubleshooting

No Output Appearing
# Verify input is being piped correctly
echo "google.com" | alivehunter -show-failed
# Check that domains are reachable
echo "httpbin.org" | alivehunter

Connection Errors or Rate Limiting
# Reduce rate and increase timeout for problematic networks
alivehunter -rate 10 -timeout 15s
# Use fewer workers for limited bandwidth
alivehunter -t 20 -rate 20

Unexpected False Positives
# Enable verification mode for zero false positives
alivehunter -verify
# Use robust title extraction for better analysis
alivehunter -title -robust-title

Debug and Analysis Mode
# Show detailed failure information
alivehunter -show-failed
# Get comprehensive data with JSON output
alivehunter -json -show-failed -title
# Test with known working domains
echo -e "google.com\nhttpbin.org\nexample.com" | alivehunter -show-failed

Changelog
v3.2.1 - Logic fix for title extraction with verification, manual install instructions
v3.2 - Advanced verification system, performance optimizations, robust HTML parsing
v3.1 - Enhanced title extraction, improved error handling, verification tracking
v3.0 - Complete rewrite for maximum speed and zero false positives
v2.x - Legacy AliveHunter versions with file input
Q: How is AliveHunter faster than httpx while maintaining accuracy?
A: AliveHunter uses optimized connection settings, strategic keep-alive management, HEAD requests by default, and efficient worker pools. The verification system eliminates false positives without sacrificing speed.
Q: When should I use each operation mode?
A:
- Fast mode: large datasets, initial filtering, time-critical scanning
- Default mode: general reconnaissance, balanced needs (recommended)
- Verify mode: final validation, critical targets, zero tolerance for false positives
Q: Can I use AliveHunter with proxy chains?
A: Yes, wrap AliveHunter itself with proxychains (it must wrap the process that opens the connections, not cat):
cat domains.txt | proxychains alivehunter -silent
Q: How does the verification system work?
A: AliveHunter analyzes response content, headers, and patterns to identify parked domains, default pages, and hosting provider placeholders. It uses a comprehensive database of false positive indicators.

Q: What's the recommended configuration for bug bounty hunting?
A: Start with fast mode for scope validation, then use verify mode for critical targets:
# Initial scope validation
cat scope.txt | alivehunter -fast -silent > live.txt
# Critical target verification
cat priority_targets.txt | alivehunter -verify -title -json > verified.json

Related Tools and Integration
- Subfinder - Subdomain discovery
- httpx - HTTP probing (can be used after AliveHunter)
- Nuclei - Vulnerability scanning
- dnsx - DNS toolkit
- amass - Network mapping
License
This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments
Inspired by Project Discovery's excellent security tools
Built for the cybersecurity research community
Optimized for real-world penetration testing and bug bounty workflows
Thanks to all contributors and the open-source security community
Contributions are welcome! Please feel free to submit pull requests, report bugs, or suggest new features.
1. Fork the repository
2. Create a feature branch (git checkout -b feature/amazing-feature)
3. Commit your changes (git commit -m 'Add amazing feature')
4. Push to the branch (git push origin feature/amazing-feature)
5. Open a Pull Request
If you encounter any issues or have questions:
Bug Reports: open an issue with detailed information
Feature Requests: describe your use case and proposed solution
Documentation: check the examples and FAQ sections first
Community: share your AliveHunter workflows and tips
AliveHunter - When speed and accuracy matter in web reconnaissance
⭐ Star this repository if AliveHunter helps you in your security research!