As Wikipedia states, complex event processing (CEP) is event processing that combines data from multiple sources to infer events or patterns that suggest more complicated circumstances.
redborder CEP executes a set of rules that, as described above, can infer events, patterns and sequences. The input to these rules is the set of events read from a number of Kafka topics, and the results of these computations are written to another set of Kafka topics.
The rules are exposed through a REST API, so you can add, remove or list rules on the fly.
The unit of work in redborder CEP is the rule. Rules are written in JSON and are accessed and modified via the REST API. Each rule combines an execution plan from a library called Siddhi with some additional data that maps your Kafka topics to Siddhi streams. You can find more information about Siddhi and execution plans below.
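To make this concrete, here is a rough sketch of what a rule could look like. The field names (`id`, `input`, `streams`, `executionPlan`, `output`) and the topic names are illustrative assumptions, not the documented schema:

```json
{
  "id": "high_traffic_alert",
  "input": ["rb_flow"],
  "streams": {
    "rb_flow": { "src": "string", "bytes": "long" }
  },
  "executionPlan": "from rb_flow[bytes > 1048576] select src, bytes insert into alerts;",
  "output": { "alerts": "rb_alerts" }
}
```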
Take a look at the rules wiki page for information about how to write rules.
The REST API lets you add, remove, list and synchronize rules on the fly. Find more information about it at the REST API wiki page.
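For example, assuming the service listens on port 8888 and exposes a `/rules` endpoint (both of these are assumptions; the actual routes are documented in the REST API wiki page), adding a rule might look like this:

```bash
# Hypothetical port and endpoint; check the REST API wiki page for the real routes
curl -X POST -H "Content-Type: application/json" \
     -d @my_rule.json http://localhost:8888/rules
```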
The engine of redborder CEP is called Siddhi. It's a Java library made by WSO2 that acts as an event processing engine: it lets you work with data streams and combine, analyze and join them in any way you want.
To write execution plans, which form the body of our rules, you need to know the Siddhi Query Language, a simple SQL-like language for working with data streams (see the sketch after this list). Some of the available features are:
- Aggregate functions like averages, sums, max, min, stddev and counts.
- Filters and Query Projection using mathematical and logical expressions.
- Default values for attributes.
- Inbuilt functions.
- Renaming attributes.
- Eval scripts, which allow Siddhi to process events using other programming languages (JavaScript, R and Scala) by defining functions in them.
- And... virtually anything you want.
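As a small taste of the language, here is a minimal SiddhiQL sketch; the stream and attribute names are made up for illustration:

```sql
define stream flowStream (src string, bytes long);

from flowStream[bytes > 1024]#window.timeBatch(1 min)
select src, avg(bytes) as avgBytes, count() as flows
group by src
insert into alertStream;
```

This defines an input stream, keeps only events larger than 1 KB, and once per minute emits the average size and the number of matching events per source into a new stream.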
You can find the complete specification for the language on WSO2's documentation site.
You can find more information about how redborder CEP works with Siddhi on the siddhi wiki page.
You will need a config file to specify the Kafka topics that will be read, the attributes that make up each stream along with their types, and a few other options such as the ZooKeeper nodes, the Kafka brokers and the REST URI that will be used to serve the REST API.
Find more information about the config file on the config file wiki page.
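As a rough sketch, such a config might look like the following. The key names here (`zk_connect`, `kafka_brokers`, `rest_uri`, `streams`) are assumptions made for illustration, not the documented format:

```yaml
# Illustrative only: these key names are assumptions, not the documented format
zk_connect: "zk1:2181,zk2:2181"
kafka_brokers: "kafka1:9092,kafka2:9092"
rest_uri: "http://localhost:8888"
streams:
  rb_flow:
    src: string
    bytes: long
```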
- Fork it
- Create your feature branch: `git checkout -b my-new-feature`
- Commit your changes: `git commit -am 'Add some feature'`
- Push to the branch: `git push origin my-new-feature`
- Create a new Pull Request