Kafka Avro Producer and Consumer

This command-line tool produces and consumes Apache Kafka messages using Avro serialization and deserialization. It includes a producer and a consumer that can send and receive Avro-encoded messages using a Schema Registry (tested with Confluent Schema Registry and Apicurio).
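For context, tools that integrate with Confluent Schema Registry typically frame each Kafka record value with a 5-byte header (a zero magic byte followed by the big-endian 4-byte schema ID) ahead of the Avro-encoded payload. The sketch below only illustrates that framing; it is not taken from this project's source, and the helper name is hypothetical.

package main

import (
	"encoding/binary"
	"fmt"
)

// wrapConfluent prepends the Confluent wire-format header to an
// Avro-encoded payload: magic byte 0x00 followed by the big-endian
// 4-byte schema ID. (Illustrative helper, not part of this tool.)
func wrapConfluent(schemaID uint32, avroPayload []byte) []byte {
	msg := make([]byte, 0, 5+len(avroPayload))
	msg = append(msg, 0x00)
	idBytes := make([]byte, 4)
	binary.BigEndian.PutUint32(idBytes, schemaID)
	msg = append(msg, idBytes...)
	return append(msg, avroPayload...)
}

func main() {
	framed := wrapConfluent(1, []byte{ /* Avro-encoded record bytes */ })
	fmt.Printf("framed message: % x\n", framed)
}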

Prerequisites

  • Go 1.16 or later
  • Kafka cluster with a Schema Registry (Confluent Schema Registry or Apicurio)

Dependencies

Usage

Build

To build the application, run the following command:

go build -o kafka-avro
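If the module's dependencies have not been fetched yet, download them first (assuming the project uses Go modules):

go mod download
go build -o kafka-avro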

Producer

To run the producer, use the following command:

./kafka-avro producer [flags]

Flags:

  • -t, --topic: Name of the topic (default: "output")
  • -b, --brokerList: Kafka broker list (default: "localhost:9092")
  • --schemaId: ID of the Avro schema (as registered in the Schema Registry) used to encode the message
  • --schemaRegistryURL: Schema Registry URL (default: "http://localhost:8081")
  • -m, --msg: Message body to send
  • -f, --file: Path to a file containing the message body
  • -a, --auth: Auth type (default: "wo"). Supported options:
    • wo - without authentication
    • tls - TLS/SSL authentication
  • --certFile: TLS certificate file (in PEM format) (default: "./client.cer.pem")
  • --keyFile: TLS key file (in PEM format) (default: "./client.key.pem")
  • --caCertFile: TLS CA certificate file (in PEM format) (default: "./server.cer.pem")
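For example, producing to a TLS-secured cluster might look like this (broker address, topic name, and certificate paths are placeholders):

./kafka-avro producer -t orders -b broker1:9093 --schemaId=1 -f examples/example.json -a tls --certFile ./client.cer.pem --keyFile ./client.key.pem --caCertFile ./server.cer.pem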

Consumer

To run the consumer, use the following command:

./kafka-avro consumer [flags]

Flags:

  • -t, --topic: Name of the topic (default: "output")
  • -b, --brokerList: Kafka broker list (default: "localhost:9092")
  • -g, --group: Consumer group (default: "kafka-console-avro-tools")
  • --schemaRegistryURL: Schema Registry URL (default: "http://localhost:8081")
  • -a, --auth: Auth type (default: "wo"). Supported options:
    • wo - without authentication
    • tls - TLS/SSL authentication
  • --certFile: TLS certificate file (in PEM format) (default: "./client.cer.pem")
  • --keyFile: TLS key file (in PEM format) (default: "./client.key.pem")
  • --caCertFile: TLS CA certificate file (in PEM format) (default: "./server.cer.pem")
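Similarly, a TLS-authenticated consumer with a custom consumer group might be started like this (broker address, topic name, and group are placeholders):

./kafka-avro consumer -t orders -b broker1:9093 -g my-consumers -a tls --certFile ./client.cer.pem --keyFile ./client.key.pem --caCertFile ./server.cer.pem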

Testing

  1. Register a simple Avro schema:
curl -s -L 'http://localhost:8081/subjects/Kafka-value/versions' \
-H 'Content-Type: application/vnd.schemaregistry.v1+json' \
-d '{"schema": "{\"type\":\"record\",\"name\":\"Record\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"},{\"name\":\"address\",\"type\":\"string\"}]}"}'
  2. Produce a message to the Kafka topic:
./kafka-avro producer --schemaId=1 --file examples/example.json
  3. Consume messages from the Kafka topic:
./kafka-avro consumer
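For reference, the schema registered in step 1 has two string fields, name and address, so a payload like the following would be valid for examples/example.json (the file shipped with the repository may differ):

{"name": "John Doe", "address": "123 Main St"}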

License

This project is licensed under the MIT License.
