bettingtool

What is bettingtool?

bettingtool is an open source odds analysis system. It follows an event-processing architecture to perform the following tasks (a minimal sketch of this event flow appears after the list):

  • Extract the odds for different types of bets in a game from online sports betting sites.
  • Emit the odds so that event processors can analyze them and produce outcomes.
  • Consume the odds and process them using a multi-objective genetic algorithm to search for an arbitrage opportunity.
  • Emit the arbitrage opportunity when it is found by an event processor.
  • Consume the arbitrage opportunities and load them onto a dashboard for real-time monitoring.
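For orientation, the sketch below shows this flow in Python with the pika RabbitMQ client: an odds event is published, and a consumer flags a simple two-outcome arbitrage when the sum of the inverse odds is below 1. The queue name, event shape, and naive arbitrage check are illustrative assumptions only; bettingtool's actual processor uses a multi-objective genetic algorithm and its own message schema.

import json

import pika

# Hypothetical queue name; bettingtool's real exchange/queue names may differ.
QUEUE = "odds_events"

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost", port=5672))
channel = connection.channel()
channel.queue_declare(queue=QUEUE)

# Emit an odds event, as the extractor would after scraping a betting site.
event = {"game": "TeamA vs TeamB", "site": "caliente", "odds": {"TeamA": 2.10, "TeamB": 2.05}}
channel.basic_publish(exchange="", routing_key=QUEUE, body=json.dumps(event))

# Consume odds events; a two-outcome market is arbitrageable when sum(1/odds) < 1.
def on_odds(ch, method, properties, body):
    odds = json.loads(body)["odds"]
    implied = sum(1.0 / price for price in odds.values())
    if implied < 1.0:
        print(f"Arbitrage opportunity: {odds} (total implied probability {implied:.3f})")

channel.basic_consume(queue=QUEUE, on_message_callback=on_odds, auto_ack=True)
channel.start_consuming()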

Supported online sports betting sites

Currently, bettingtool supports data extraction for the following online betting sites:

  • Caliente.

Support for more online sports betting sites is coming soon.

Installation Instructions

Docker Compose:

docker-compose build
docker-compose up
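
To keep the services running in the background, docker-compose up -d starts them in detached mode.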

Manual installation: follow the steps under Running bettingtool below (the prerequisites plus the per-component startup commands).

Running bettingtool

To start the bettingtool components you will need the following processes running (a quick connectivity check is sketched after this list):

  • A local Redis instance listening on port 6379 (the default).
  • A local RabbitMQ server instance listening on port 5672 (the default).
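
If you are not sure both services are reachable, a quick pre-flight check like the one below (using the redis and pika Python clients; not part of bettingtool itself) will fail loudly if either is down:

import pika
import redis

# Raises an exception if Redis is not reachable on the default port.
redis.Redis(host="localhost", port=6379).ping()

# Raises an exception if RabbitMQ is not reachable on the default port.
pika.BlockingConnection(pika.ConnectionParameters(host="localhost", port=5672)).close()

print("Redis and RabbitMQ are up")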

Start the extractor:

python -m extractor.sample.main

Start the processor:

python -m processor.sample.main

Start the server:

  • Install dependencies first:
npm install
  • In one terminal:
npm run webpack
  • In another terminal:
npm start

Start the dashboard:

npm install
ng serve

Testing

Currently, bettingtool includes basic unit tests for several components, such as the extractor, processor, and pythontools. You can run them with the pytest framework (https://docs.pytest.org/en/stable/getting-started.html).

Steps:

  • From the root directory of any component mentioned above, cd into its tests folder.
  • Run the pytest command, as shown below.
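
For example, to run the extractor's unit tests from the repository root:

cd extractor/tests
pytest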

The goal is to support both unit and integration testing for all the components involved in the system.

Community and Contributing

Contributions are welcome. Check out the Contribution Guide.

Directory Structure

├── dashboard                   # UI for monitoring the events produced by the processor.
├── extractor                   # Process manager for the web crawlers.
├── processor                   # Process manager for the processing engines.
├── pythontools                 # Common logic for the extractor and the processor.
└── server                      # HTTP server for pushing real-time data to the UI.

Licensing

The code in this project is released under the MIT License.