dataminer is an MTConnect utility that demonstrates how to collect data from MTConnect agents and store it in a scalable data warehouse so it can be searched, analyzed, and visualized in real time. It is written in C++ using the Boost and OpenSSL libraries.
In this demonstration, we use the open-source Elastic Stack (ELK) to build a scalable data warehouse. ELK has three key components:
- Elasticsearch is a distributed search and analytics engine.
- Logstash is a data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to the Elasticsearch engine.
- Kibana visualizes the data stored in the Elasticsearch engine.
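Once records land in Elasticsearch, they can be searched with its JSON query DSL. A hypothetical query for the last five minutes of samples from one device might look like the following (the index field names `deviceId` and `timestamp` are illustrative, not dataminer's actual schema):

```json
{
  "query": {
    "bool": {
      "filter": [
        { "term":  { "deviceId": "dev1" } },
        { "range": { "timestamp": { "gte": "now-5m" } } }
      ]
    }
  }
}
```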
The following diagram depicts the interaction among MTConnect agents, dataminer, and the ELK components:
Here is a screenshot from Kibana displaying XYZ positions of a device in real time:
dataminer is written in C++ using the Boost and OpenSSL libraries, and it uses CMake as its build system. First, download and install:
- CMake
- boost
- openssl - This supports the https secure protocol. On Windows, after the build, prepend the locations of libcrypto.dll and libssl.dll to the PATH system variable.
Then run these commands:
- cmake .
- cmake --build . --target all
If the build succeeds, the dataminer executable is generated in the current directory.
Usage:
- dataminer [ json file output location ] [ MTConnect Agent URL address ], for example: `dataminer /tmp http://localhost:5000`
- Start the Elasticsearch and Kibana services with the default configuration.
- Configure Logstash to ingest the json files generated by dataminer.
First, update the path variable in MTDataMiner.conf to point to the location of the json files output by dataminer. For example:
path => [ "/tmp/MTComponentStreams-*.log" ]
Copy MTDataMiner.conf to the local Logstash config location, such as /usr/local/etc/logstash. Then update or add path.config in logstash.yml so that it points to MTDataMiner.conf:
path.config: /usr/local/etc/logstash/MTDataMiner.conf
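For reference, a minimal Logstash pipeline of this shape could look like the sketch below. This is not the actual MTDataMiner.conf shipped with dataminer; the `hosts` value assumes a local Elasticsearch on its default port, and the index name matches the mtconnect-demo index used in the verification step:

```conf
input {
  file {
    path => [ "/tmp/MTComponentStreams-*.log" ]
    codec => "json"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "mtconnect-demo"
  }
}
```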
- Start the Logstash service.
- To verify the setup, log in to Kibana and look for the index mtconnect-demo. If found, create an index pattern from mtconnect-demo.
You can then start analyzing your data via ELK.
- Output data items (sample, event, or condition) as individual json records with their associated device/component id attributes.
- Determine which data items are numeric using information from the "probe" request.
- Poll data history using "sample" requests and track the last request's sequence number so polling can resume after a restart. If the last sequence number is not present, start from a snapshot of the current data.
- Support polling from multiple agents at different frequencies.
Download macOS Installation
Download Windows Installation