Consuming and Producing Messages
Explore how to use the kafka-console-producer.sh and kafka-console-consumer.sh tools to manually produce and consume messages in Kafka. Understand the process of creating topics, sending messages, and verifying a Kafka Streams application's output to aid development and debugging.
In a Kafka Streams application, data processing is triggered by incoming messages consumed from one or more Kafka topics. In some applications, the stream processing ends with messages produced to one or more topics. This happens automatically because our application is usually part of a larger system where upstream applications are the producers of our incoming messages, and downstream applications are the consumers of our outbound messages.
However, verifying that the application we are building works as expected at different stages of implementation is an important part of our job as developers. Being able to trigger our Kafka Streams application manually, on demand, makes our work easier and allows us to catch and fix bugs earlier in the development cycle. The same is true for the other end of the ...
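As a rough sketch of what this manual workflow might look like (the broker address localhost:9092 and the topic names input-topic and output-topic are placeholders, and recent Kafka versions are assumed), we could create the input topic, type a few test messages into kafka-console-producer.sh, and watch the application's output with kafka-console-consumer.sh:

```sh
# Create the input topic our Kafka Streams application reads from
# (topic name, partition count, and broker address are placeholders).
bin/kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic input-topic --partitions 1 --replication-factor 1

# Manually produce test messages; each line typed becomes one record.
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 \
  --topic input-topic

# In another terminal, consume from the application's output topic
# to verify what the Streams topology actually emitted.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic output-topic --from-beginning
```

Note that older Kafka releases expect --broker-list instead of --bootstrap-server for the console producer, so the exact flags depend on the version installed.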