This project consists of two parts: a scraper (spider) and a listener. The spider walks through web pages and sends the payloads it finds to the listener. The listener waits for payloads and sends them to the PolySE database using the SDK.
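For illustration only, the sketch below shows one possible shape of the payload a spider could hand to the listener; the struct name and fields are assumptions for this example, not the project's actual types.

```go
// Hypothetical payload passed from the spider to the listener (illustrative only).
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// Payload is an assumed document record extracted from a crawled page.
type Payload struct {
	URL       string    `json:"url"`        // page the data was scraped from
	Title     string    `json:"title"`      // page title
	Text      string    `json:"text"`       // extracted body text
	CrawledAt time.Time `json:"crawled_at"` // crawl timestamp
}

func main() {
	p := Payload{
		URL:       "http://go-colly.org",
		Title:     "Colly",
		Text:      "Fast and elegant scraping framework for Gophers",
		CrawledAt: time.Now(),
	}
	// The listener would receive something like this JSON via the message queue.
	b, _ := json.Marshal(p)
	fmt.Println(string(b))
}
```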
go get github.com/polyse/web-scrapper
- Import the package:
import ws "github.com/polyse/web-scrapper"
- Install and start RabbitMQ (a connectivity check is sketched below).
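Before starting the services, it can be handy to confirm the broker is reachable. This is a minimal sketch using the streadway/amqp client and RabbitMQ's default guest/guest credentials on localhost:5672; the project's actual connection settings may differ.

```go
// Quick RabbitMQ connectivity check (assumes default local credentials).
package main

import (
	"log"

	"github.com/streadway/amqp"
)

func main() {
	conn, err := amqp.Dial("amqp://guest:guest@localhost:5672/")
	if err != nil {
		log.Fatalf("RabbitMQ is not reachable: %v", err)
	}
	defer conn.Close()
	log.Println("RabbitMQ is up and reachable")
}
```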
- Start the PolySE database on <example_host>:<example_port>
- Build and run the spider:
cd cmd\daemon
go build -o daemon.exe
daemon.exe
- Build and run the listener:
cd cmd\listener
go build -o listener.exe
listener.exe
- Send a POST request with an auth Bearer token to, for example (see the sketch below):
localhost:7171/start?url=http://go-colly.org
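A minimal way to send that request from Go with the standard library might look like the following; the token value is a placeholder, and the host, port, and path are taken from the example above.

```go
// Sketch: start a crawl by POSTing to the daemon with a Bearer token.
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/url"
)

func main() {
	// Build the crawl-start URL from the example above; url.Values handles escaping.
	q := url.Values{}
	q.Set("url", "http://go-colly.org")
	endpoint := "http://localhost:7171/start?" + q.Encode()

	req, err := http.NewRequest(http.MethodPost, endpoint, nil)
	if err != nil {
		log.Fatal(err)
	}
	// Placeholder token: substitute the Bearer token your deployment expects.
	req.Header.Set("Authorization", "Bearer <example_token>")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status, string(body))
}
```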
- Enjoy the results.