Data ingest and visualization using Kibana and Elastic Search


dataminer is an MTConnect utility that demonstrates how to collect data from MTConnect agents and store it in a scalable data warehouse so it can be searched, analyzed, and visualized in real time. It is written in C++ using the boost and openssl libraries.

In this demonstration, we use the open-source Elastic technology stack, ELK, to build a scalable data warehouse. ELK has three key components:

ElasticSearch is a distributed search and analytics engine.

LogStash is a data processing pipeline that ingests data from multiple sources simultaneously, transforms it, and then sends it to the ElasticSearch data engine.

Kibana visualizes the data from the ElasticSearch data engine.

The following diagram depicts the interaction among MTConnect agents, dataminer, and the ELK components:

[Diagram: dataminer_ELK]

Here is a screenshot from Kibana displaying XYZ positions of a device in real time:

[Screenshot: dataminer_Kibana]

Building

dataminer uses CMake as its build system and depends on the boost and openssl libraries. First, download and install these prerequisites:

  • CMake
  • boost
  • openssl - required for HTTPS support. On Windows, after the build, prepend the directory containing libcrypto.dll and libssl.dll to the PATH environment variable.

Then run these commands:

  • cmake .
  • cmake --build . --target all

If the build succeeds, the dataminer executable is generated in the current directory.

Usage:

  • dataminer [ JSON file output location ] [ MTConnect agent URL ]
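
For example, assuming an agent running locally on port 5000 and /tmp as the output location (both hypothetical values):

  • dataminer /tmp http://localhost:5000

dataminer then writes its JSON records to files under the output location (the MTComponentStreams-*.log pattern used in the LogStash configuration below).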

ELK Configuration

  1. Start the ElasticSearch and Kibana services with their default configurations.

  2. Configure LogStash to ingest the JSON files generated by dataminer (a complete pipeline sketch follows this list).

    First, update the path variable in MTDataMiner.conf to point at the JSON files output by dataminer. For example:

    path => [ "/tmp/MTComponentStreams-*.log" ]

    Copy MTDataMiner.conf to the local LogStash config location, such as /usr/local/etc/logstash. Then update/add path.config in logstash.yml to use this MTDataMiner.conf:

    path.config: /usr/local/etc/logstash/MTDataMiner.conf

  3. Start the LogStash service.

  4. To verify the setup, log in to Kibana and look for the index mtconnect-demo. If found, create an index pattern from mtconnect-demo.
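
For reference, a minimal MTDataMiner.conf pipeline might look like the following sketch. This is an illustration rather than the shipped file: the file path repeats the example above, while the hosts, start_position, and codec settings are assumed defaults that you should adjust for your environment.

    input {
      file {
        # tail the JSON records written by dataminer
        path => [ "/tmp/MTComponentStreams-*.log" ]
        start_position => "beginning"
        codec => "json"
      }
    }

    output {
      elasticsearch {
        # index into mtconnect-demo, the index Kibana looks for above
        hosts => [ "http://localhost:9200" ]
        index => "mtconnect-demo"
      }
    }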

You can then start analyzing your data via ELK.

v1.1.0 Release Notes

  1. Output data items (sample, event, or condition) as individual JSON records with their associated device/component id attributes.
  2. Determine numeric data items using information from the "probe" request.
  3. Poll data history using "sample" requests and track the last request's sequence number so polling can recover on restart (see the sketch after this list). If the last sequence number is not present, it starts from the snapshot of current data.
  4. Support polling multiple agents at different frequencies.
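
The restart-recovery flow in item 3 can be sketched in a few lines of C++. This is a minimal illustration, not dataminer's actual implementation: fetchSample is a hypothetical stub standing in for the real MTConnect /sample HTTP request, and the checkpoint path and agent URL are made-up values.

    #include <cstdint>
    #include <fstream>
    #include <iostream>
    #include <optional>
    #include <string>

    // Hypothetical stub: a real implementation would GET
    // <agentUrl>/sample?from=<fromSeq>, write the returned items as JSON
    // records, and return the nextSequence value from the MTConnect header.
    std::uint64_t fetchSample(const std::string& agentUrl, std::uint64_t fromSeq) {
        (void)agentUrl;
        return fromSeq + 1;  // pretend one new observation arrived
    }

    std::optional<std::uint64_t> loadCheckpoint(const std::string& path) {
        std::ifstream in(path);
        std::uint64_t seq = 0;
        if (in >> seq) return seq;  // resume where the last run stopped
        return std::nullopt;        // first run: start from the snapshot
    }

    void saveCheckpoint(const std::string& path, std::uint64_t seq) {
        std::ofstream(path, std::ios::trunc) << seq;
    }

    int main() {
        const std::string agent = "http://localhost:5000";   // hypothetical agent
        const std::string checkpoint = "/tmp/dataminer.seq"; // hypothetical path

        // With no checkpoint, a real run would take the /current snapshot
        // and use its nextSequence; 1 is just a stand-in here.
        std::uint64_t seq = loadCheckpoint(checkpoint).value_or(1);

        for (int i = 0; i < 3; ++i) {        // the real tool polls continuously
            seq = fetchSample(agent, seq);
            saveCheckpoint(checkpoint, seq); // persist so a restart can recover
        }
        std::cout << "last sequence: " << seq << "\n";
    }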

Binary Releases v1.1.0

Download macOS Installation

Download Windows Installation
