This is an application for consuming data exposed by the pub/sub system in the RADAR-IoT framework and processing or uploading it to different destinations. The application consumes data from channels in a pub/sub system and can process it in various ways. One such implementation uploads the data to the RADAR-base platform's backend.
The main terms used in the context of this application are:

- `Handler` - handles data coming in from the pub/sub system. It usually proxies data from the pub/sub system and forwards it to the `DataConsumer`s.
- `Consumer` or `DataConsumer` - components that handle the actual processing of the data received from the `Handler`s.
- `Converter` - converts a message from one format to another, so that data read from the pub/sub system can be sent to or processed by the destination system (for `DataConsumer`s). Hence these are specific to a particular sensor for a particular consumer. One example of a converter is one that deserializes a message received in JSON format from the pub/sub system and serializes it to Avro format for uploading to Kafka.
- `Connection` - represents a connection to an external entity (like the pub/sub system or a destination entity like InfluxDB).
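To make these relationships concrete, here is a minimal sketch of how a handler, a converter, and consumers fit together. The interfaces below are simplified stand-ins written for illustration; the actual classes in the `commons` and `data-uploader` modules may differ.

```kotlin
// Simplified stand-ins for the real Handler/Converter/DataConsumer
// contracts; the actual interfaces in the commons and data-uploader
// modules may differ.
interface Converter<I, O> {
    fun convert(message: I): O
}

abstract class DataConsumer<T> {
    abstract fun consume(records: List<T>)
}

// A handler proxies raw messages from the pub/sub system, runs them
// through a converter, and forwards the result to its consumers.
class SimpleHandler<I, O>(
    private val converter: Converter<I, O>,
    private val consumers: List<DataConsumer<O>>,
) {
    fun onMessage(raw: I) {
        val converted = converter.convert(raw)
        consumers.forEach { it.consume(listOf(converted)) }
    }
}
```

In the real application the handler subscribes to pub/sub channels, and the consumers upload the converted records to their destinations.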
Configuration files can be added to the classpath, or their location can be defined by the environment variable `RADAR_IOT_CONFIG_LOCATION`. An example file can be found in `radar_iot_config.yaml`.
This can be run as a normal JVM application by running the main class `org.radarbase.iot.DataUploaderApplication.kt`, or using the Gradle wrapper with `./gradlew run`.
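For example, to point the application at a configuration file outside the classpath before starting it (the path below is purely illustrative):

```shell
# Tell the uploader where to find its configuration
# (the path is just an example; use your own location).
export RADAR_IOT_CONFIG_LOCATION="$HOME/radar-iot/radar_iot_config.yaml"

# Then start the application with the Gradle wrapper:
# ./gradlew run
```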
A Dockerfile is provided for convenience of deploying the application. A docker-compose file is also provided so that the application and its dependencies can be deployed together; it is located at `docker/data-uploader.yml`.

You will first need to configure the application by copying `radar_iot_config.yaml.template` to `radar_iot_config.yaml` and updating the configuration values. You can then run the application with `docker-compose -f docker/data-uploader.yml up -d`.
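The two steps above might look like this from the repository root (the copy is guarded so it is skipped when the template is absent, and the compose command is shown as a comment since it needs a running Docker daemon):

```shell
# Create a local configuration from the template, if it is present.
if [ -f radar_iot_config.yaml.template ]; then
    cp radar_iot_config.yaml.template radar_iot_config.yaml
fi
# Edit radar_iot_config.yaml to set your configuration values, then:
# docker-compose -f docker/data-uploader.yml up -d
```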
For the Rest Proxy data consumer, authorization via Management Portal is enabled by default, hence the uploader will work with the Gateway as well as the Rest Proxy.
For contributing, please take a look at the `commons` module and the `data-uploader` module. Code should be formatted using the Kotlin Style Guide. If you want to contribute a feature or fix, browse our issues and please make a pull request.

For simple additions like new sensors, take a look at the converter package, where you will need to add new `Converter`s for your sensor based on the consumers.
Various bits of the application can be extended -
- To extend the authorizer, look at extending the `Authorizer` interface in the [commons](../commons) module. Currently, only `ManagementPortalAuthorizer` is implemented.
- To add another data consumer/processor (like RestProxyDataConsumer), extend the DataConsumer abstract class and specify how you want to process the data. Also see InfluxDb consumer.
- To add a new type of data format for reading data from the pub/sub system (currently JSON is supported via the `JsonMessageParser` class), implement the `Parser.kt` interface from the `util` package in the `commons` module and pass that in to the `Converter`s.
- Currently, a Redis-based handler for data is provided in `RedisDataHandler`, but other handlers can be added by implementing the `Handler` interface. The new handler will then need to be added to the array of handlers in the main class.
- The communication with the pub/sub system is handled using a connection and a subscriber, both of which can be extended by implementing their respective interfaces.
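As an illustration of the parser extension point, here is a sketch of an alternative message parser. The interface shown is a simplified stand-in for `Parser.kt` in the `commons` module's `util` package, and the `KeyValueMessageParser` name and format are hypothetical; the real signature may differ.

```kotlin
// Simplified stand-in for the Parser interface in the commons module's
// util package; the real signature may differ.
interface Parser<I, O> {
    fun parse(input: I): O
}

// Hypothetical parser for a simple "key=value;key=value" line format,
// as an alternative to the provided JsonMessageParser.
class KeyValueMessageParser : Parser<String, Map<String, String>> {
    override fun parse(input: String): Map<String, String> =
        input.split(';')
            .filter { it.contains('=') }
            .associate { pair ->
                val (key, value) = pair.split('=', limit = 2)
                key.trim() to value.trim()
            }
}
```

An instance of such a parser would then be passed to the relevant `Converter`s, as described above.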