
Eclipse Zenoh-Flow Examples

Join the chat at https://gitter.im/atolab/zenoh-flow

Zenoh-Flow provides a zenoh-based dataflow programming framework for computations that span from the cloud to the device.

⚠️ This software is still in alpha status and should not be used in production. Breaking changes are likely to happen and the API is not stable.


Description

Zenoh-Flow allows users to declare a dataflow graph, via a YAML file, and to use tags to express location affinity and requirements for the operators that make up the graph. When deploying the dataflow graph, Zenoh-Flow automatically deals with distribution by linking remote operators through zenoh.

A dataflow is composed of a set of sources (producing data), operators (computing over the data), and sinks (consuming the resulting data). These components are dynamically loaded at runtime.

Remote sources, operators, and sinks leverage zenoh to communicate in a transparent manner. In other words, the dataflow graph retains location transparency and can be deployed in different ways depending on specific needs.

Zenoh-Flow provides several working examples that illustrate how to define operators, sources, and sinks, as well as how to declaratively define their dataflow graph by means of a YAML file.
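
To make the idea concrete, a minimal sketch of such a YAML descriptor is shown below. The field names are assumptions made for illustration only; the files under ./graphs/ in this repository contain the actual descriptors used by the examples.

flow: FizzBuzz
sources:
  - id: ManualSource                                    # produces data
    uri: file://./target/release/libmanual_source.so    # library extension depends on your OS
operators:
  - id: Fizz                                            # computes over the data
    uri: file://./target/release/libexample_fizz.so
sinks:
  - id: Printer                                         # consumes the resulting data
    uri: file://./target/release/libgeneric_sink.so
links:
  - from: ManualSource.Int                              # port names are illustrative
    to: Fizz.Int
mapping:                                                # location affinity: pin components to runtimes
  ManualSource: foo
  Fizz: bar
  Printer: bar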


Examples

Runtime

First, let's build an example runtime to run the examples:

cargo build --release -p runtime

This will create the runtime binary at ./target/release/runtime.


FizzBuzz

First, compile the relevant examples:

cargo build --release -p manual-source -p example-fizz -p example-buzz -p generic-sink

Depending on your OS, this will create the libraries that the pipeline will load.

Single runtime

To run all components on the same Zenoh Flow runtime:

./target/release/runtime --graph-file ./graphs/fizz_buzz_pipeline.yaml --runtime foo

Note: in this particular case, the --runtime foo argument is ignored.

Multiple runtimes

In a first machine, run:

./target/release/runtime --graph-file ./graphs/fizz-buzz-multiple-runtimes.yaml --runtime foo

In a second machine, run:

./target/release/runtime --graph-file ./graphs/fizz-buzz-multiple-runtimes.yaml --runtime bar

⚠️ If you change the runtime names in the YAML file, the names passed as arguments to the previous commands must be changed accordingly.

⚠️ Without additional configuration, the machines need to be on the same local network for this example to work. See how to add a Zenoh router if you want to connect them over the internet.


OpenCV FaceDetection - Haarcascades

⚠️ This example works only on Linux and requires OpenCV to be installed; please follow the instructions in the OpenCV documentation to install it.

⚠️ You need a machine equipped with a webcam in order to run this example.

First, compile the relevant examples:

cargo build --release -p camera-source -p face-detection -p video-sink

Depending on your OS, this will create the libraries that the pipeline will load.

Single runtime

To run all components on the same Zenoh Flow runtime:

./target/release/runtime --graph-file ./graphs/face_detection.yaml --runtime foo

Note: in this particular case, the --runtime foo argument is ignored.

Multiple runtimes

In a first machine, run:

./target/release/runtime --graph-file ./graphs/face-detection-multi-runtime.yaml --runtime gigot

In a second machine, run:

./target/release/runtime --graph-file ./graphs/face-detection-multi-runtime.yaml --runtime nuc

In a third machine, run:

./target/release/runtime --graph-file ./graphs/face-detection-multi-runtime.yaml --runtime leia

⚠️ If you change the runtime names in the YAML file, the names passed as arguments to the previous commands must be changed accordingly.

⚠️ Without additional configuration, the machines need to be on the same local network for this example to work. See how to add a Zenoh router if you want to connect them over the internet.


OpenCV Object Detection - Deep Neural Network - CUDA powered

⚠️ This example works only on Linux and requires OpenCV built with CUDA support; please follow the instructions in this gist to install it.

⚠️ This example works only on Linux and requires a CUDA-capable NVIDIA GPU, as well as NVIDIA CUDA and cuDNN to be installed; please follow the CUDA instructions and the cuDNN instructions.

⚠️ You need a machine equipped with a webcam in order to run this example.

⚠️ You need to download a YOLOv3 configuration, weights, and classes; you can use the ones from this GitHub repository.

First, compile the relevant examples:

cargo build --release -p camera-source -p object-detection-dnn -p video-sink

Depending on your OS, this will create the libraries that the pipeline will load.

Then please update the files ./graphs/dnn-object-detection.yaml and ./graphs/dnn-object-detection-multi-runtime.yaml by changing neural-network, network-weights, and network-classes to the absolute paths of your neural network configuration, weights, and classes files.
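
As a sketch, the relevant fragment might look like the following. Only the three key names come from the text above; the nesting under the operator's configuration and the paths are assumptions you must adapt to the actual graph files:

operators:
  - id: ObjectDetection
    uri: file://./target/release/libobject_detection_dnn.so
    configuration:
      neural-network: /absolute/path/to/yolov3.cfg        # replace with your YOLOv3 configuration
      network-weights: /absolute/path/to/yolov3.weights   # replace with your YOLOv3 weights
      network-classes: /absolute/path/to/coco.names       # replace with your classes file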

Single runtime

To run all components on the same Zenoh Flow runtime:

./target/release/runtime --graph-file ./graphs/dnn-object-detection.yaml --runtime foo

Note: in this particular case, the --runtime foo argument is ignored.

Multiple runtimes

In a first machine, run:

./target/release/runtime --graph-file ./graphs/dnn-object-detection-multi-runtime.yaml --runtime foo

In a second machine, run:

./target/release/runtime --graph-file ./graphs/dnn-object-detection-multi-runtime.yaml --runtime cuda

In a third machine, run:

./target/release/runtime --graph-file ./graphs/dnn-object-detection-multi-runtime.yaml --runtime bar

⚠️ If you change the runtime names in the YAML file, the names passed as arguments to the previous commands must be changed accordingly.

⚠️ Without additional configuration, the machines need to be on the same local network for this example to work. See how to add a Zenoh router if you want to connect them over the internet.

OpenCV Car Vision - Deep Neural Network - CUDA powered

Car vision dataflow

⚠️ This example works only on Linux and requires OpenCV built with CUDA support; please follow the instructions in this gist to install it.

⚠️ This example works only on Linux and requires a CUDA-capable NVIDIA GPU, as well as NVIDIA CUDA and cuDNN to be installed; please follow the CUDA instructions and the cuDNN instructions.

⚠️ You need a machine equipped with a webcam in order to run this example.

⚠️ You need to download a YOLOv3 configuration, weights, and classes; you can use the ones from this GitHub repository.

⚠️ You need to download a car camera video; you can use the ones from this data set. The dataset contains individual frames; to merge them into a video you need ffmpeg and can run the following command: ffmpeg -framerate 15 -pattern_type glob -i 'I1*.png' -c:v libx264 I1.mp4

First, compile the relevant examples:

cargo build --release -p video-file-source -p object-detection-dnn -p video-sink

Depending on your OS, this will create the libraries that the pipeline will load.

Then please edit the file ./graphs/car-pipeline-multi-runtime.yaml by changing neural-network, network-weights, and network-classes to the absolute paths of your neural network configuration, weights, and classes files.

You also need to edit ./graphs/car-pipeline-multi-runtime.yaml so that the video source points to the absolute path of your video file.
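
As a sketch (the key name for the video file is hypothetical; check the configuration section of the video-file-source entry in the descriptor for the actual one), the source entry might look like:

sources:
  - id: VideoFileSource
    uri: file://./target/release/libvideo_file_source.so
    configuration:
      file: /absolute/path/to/I1.mp4                     # hypothetical key; point it at your video file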

Multiple runtimes

In a first machine, run:

./target/release/runtime --graph-file ./graphs/car-pipeline-multi-runtime.yaml --runtime gigot

In a second machine, run:

./target/release/runtime --graph-file ./graphs/car-pipeline-multi-runtime.yaml --runtime cuda

In a third machine, run:

./target/release/runtime --graph-file ./graphs/car-pipeline-multi-runtime.yaml --runtime macbook

⚠️ If you change the runtime names in the YAML file, the names passed as arguments to the previous commands must be changed accordingly.

⚠️ Without additional configuration, the machines need to be on the same local network for this example to work. See how to add a Zenoh router if you want to connect them over the internet.
