Generate Avro schemas from Python dataclasses, Pydantic models, and Faust Records. Code generation from Avro schemas. Serialize/deserialize Python instances with Avro schemas.
Updated Nov 6, 2024 - Python
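The entry above describes generating Avro schemas from dataclasses (as libraries like dataclasses-avroschema do). The core idea can be sketched with the standard library alone; the `avro_schema` helper and the `AVRO_TYPES` mapping below are illustrative assumptions, not any library's actual API:

```python
# Hedged sketch: derive a minimal Avro "record" schema from a Python
# dataclass using only the standard library. Real tools handle many
# more types (logical types, unions, nested records, defaults, etc.).
import dataclasses

# Assumed mapping from a few Python types to Avro primitive types.
AVRO_TYPES = {str: "string", int: "long", float: "double", bool: "boolean"}

def avro_schema(cls) -> dict:
    """Build an Avro record schema dict from a dataclass's fields."""
    return {
        "type": "record",
        "name": cls.__name__,
        "fields": [
            {"name": f.name, "type": AVRO_TYPES[f.type]}
            for f in dataclasses.fields(cls)
        ],
    }

@dataclasses.dataclass
class User:
    name: str
    age: int

schema = avro_schema(User)
```

The resulting dict can be dumped with `json.dumps` and fed to an Avro serializer for encoding/decoding instances.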
Graph streaming framework
A highly-configurable, real-time data quality monitoring tool designed for streaming data
A simple Python-based distributed workflow engine
An end-to-end, web event stream processing pipeline
Django with Kafka, Debezium, and Faust for Email Sending using Change Data Capture
This project covers the skills needed to process data in real time, building fluency in modern data engineering tools such as Apache Spark, Kafka, Spark Streaming, and Kafka Streams.
Python bindings for RocksDB used by faust-streaming
This project optimizes a Bank Marketing Model by building an event streaming pipeline around Apache Kafka and its ecosystem that communicates with a machine-learning model microservice, displaying the likelihood and status of bank customers in real time.
Projects completed in the Udacity Data Streaming Nanodegree program. Tech used: Apache Kafka, Kafka Connect, KSQL, Faust Stream Processing, Spark Structured Streaming