Have you ever wanted an absolutely pointless list of domains to visit and discover? Well, this solves that! Quickly spin it up, start it, and discover tons of absolutely pointless domains and websites. They will be nicely stored in a database for you to view "when you have time", i.e. never.
I just wanted to learn Rust, and I'd had this idea for a while, so I went for it.
```bash
cargo build
cargo run
```

Copy the docker-compose.yml to your local machine and run

```bash
docker-compose up -d
```

The database will persist in `./data/domains.db`.
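If you just want a sense of the shape of that file, a minimal equivalent might look something like this. The service name, build context, and container path are placeholders; the repo's docker-compose.yml is the source of truth:

```yaml
# Hypothetical sketch, not the repo's actual compose file.
services:
  pointless-domains:      # placeholder service name
    build: .              # or a prebuilt image, depending on how you deploy
    ports:
      - "1330:1330"       # default port, see below
    volumes:
      - ./data:/app/data  # keeps domains.db across restarts; container path is a guess
    restart: unless-stopped
```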
The default port is 1330, so go to http://localhost:1330, or to whatever nginx proxy you mad people decide to put in front of it.
I let this run for an hour or so to benchmark performance. It scanned thousands of domains, and the results below stayed very stable; it seems to be stupidly performant.
- CPU: ~0.5-1% average during active scanning
- Memory: ~34MB (debug), ~20-25MB (release)
Assuming you keep this on 24/7, this is what you are going to be looking at:
- 8 requests/second
- 480 requests/minute
- 28,800 requests/hour
- 691,200 requests/day
- ~20.7 million requests/month
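For what it's worth, 8 requests/second works out to one request every 125 ms, so a pacing loop along these lines lands on exactly the numbers above. This is a minimal sketch assuming tokio; whether the scanner actually throttles this way is a guess on my part:

```rust
use std::time::Duration;

#[tokio::main]
async fn main() {
    // One tick every 125 ms = 8 requests/second.
    let mut ticker = tokio::time::interval(Duration::from_millis(125));
    loop {
        ticker.tick().await;
        // scan_next_domain().await; // hypothetical scan step
    }
}
```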
- Build script downloads 370k+ English words
- Scanner combines 1, 2, or 3 words with TLDs (.com, .net, .org, and so on)
- *Tries* to filter out placeholder/parking pages
- Extracts the meta description or the first paragraph
- Stores potentially valid domains in SQLite

Rough, hedged sketches of each of these steps follow below.
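Taking those in order: the word list download could live in a build script roughly like this. The URL is a placeholder, and reqwest (with the `blocking` feature) is an assumption; the actual build script may do it differently:

```rust
// build.rs (hypothetical sketch)
use std::{env, fs, path::PathBuf};

fn main() {
    // Placeholder URL, not the real word list.
    let url = "https://example.com/words_alpha.txt";
    let words = reqwest::blocking::get(url)
        .and_then(|resp| resp.text())
        .expect("failed to download word list");
    // Cargo sets OUT_DIR for build scripts; stash the list there.
    let out = PathBuf::from(env::var("OUT_DIR").unwrap()).join("words.txt");
    fs::write(&out, words).expect("failed to write word list");
    // Rebuild only when the build script itself changes.
    println!("cargo:rerun-if-changed=build.rs");
}
```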
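The combination step is conceptually just nested loops over words and TLDs. Names and signatures here are mine, not the project's; three-word combos work the same way, one loop deeper:

```rust
fn combine(words: &[&str], tlds: &[&str]) -> Vec<String> {
    let mut out = Vec::new();
    for tld in tlds {
        for a in words {
            out.push(format!("{a}{tld}")); // one word: "blue.com"
            for b in words {
                out.push(format!("{a}{b}{tld}")); // two words: "bluefox.com"
            }
        }
    }
    out
}

fn main() {
    for domain in combine(&["blue", "fox"], &[".com", ".net", ".org"]) {
        println!("{domain}");
    }
}
```

With 370k words, even the two-word space is enormous (roughly 137 billion combinations per TLD), so there will always be more pointless domains to find.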
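Filtering parking pages and pulling out a description could look like the sketch below. The `scraper` crate and the specific parking-page markers are my assumptions, and, as the list says, this kind of filter is best-effort at most:

```rust
use scraper::{Html, Selector};

/// Returns a description for pages that don't look parked, or None otherwise.
/// Both the markers and the parsing approach are guesses at the technique.
fn describe(body: &str) -> Option<String> {
    // Crude placeholder/parking signals.
    let lower = body.to_lowercase();
    for marker in ["domain is for sale", "parked domain", "buy this domain"] {
        if lower.contains(marker) {
            return None;
        }
    }
    let doc = Html::parse_document(body);
    // Prefer the meta description...
    let meta = Selector::parse(r#"meta[name="description"]"#).ok()?;
    if let Some(el) = doc.select(&meta).next() {
        if let Some(content) = el.value().attr("content") {
            return Some(content.to_string());
        }
    }
    // ...falling back to the first paragraph.
    let p = Selector::parse("p").ok()?;
    doc.select(&p)
        .next()
        .map(|el| el.text().collect::<String>().trim().to_string())
}
```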
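And the storage side, assuming rusqlite; the schema is a guess at the minimum a viewer would need:

```rust
use rusqlite::{params, Connection};

fn main() -> rusqlite::Result<()> {
    // Same path the compose volume maps above.
    let conn = Connection::open("data/domains.db")?;
    conn.execute(
        "CREATE TABLE IF NOT EXISTS domains (
            domain      TEXT PRIMARY KEY,
            description TEXT
        )",
        [],
    )?;
    // INSERT OR IGNORE keeps rescans from duplicating rows.
    conn.execute(
        "INSERT OR IGNORE INTO domains (domain, description) VALUES (?1, ?2)",
        params!["bluefox.com", "An example description"],
    )?;
    Ok(())
}
```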
