Consuming and Producing Messages

Learn how to produce and consume messages using the CLI.

In a Kafka Streams application, data processing is triggered by incoming messages consumed from one or more Kafka topics. In some applications, the stream processing ends with messages produced to one or more topics. This happens automatically because our application is usually part of a larger system, where upstream applications produce our incoming messages and downstream applications consume our outbound messages.

However, verifying that the application we are building works as expected during the different stages of implementation is an important part of our job as developers. Being able to manually trigger our Kafka Streams application on demand makes our work easier and allows us to catch and fix bugs earlier in the development cycle. The same is true for the other end of the application: being able to manually consume the messages our application produces helps us verify that the data is processed and published correctly.

Following that logic, we will now add two more tools to our Kafka tool belt, with example invocations shown after the list:

  • Using kafka-console-producer.sh to produce messages to a Kafka topic.

  • Using kafka-console-consumer.sh to consume messages from a Kafka topic.
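As a sketch of how these two tools fit together (assuming a broker listening on localhost:9092 and using input-topic and output-topic as placeholder topic names, not names taken from this course), a manual test session might look like this:

    # Terminal 1: produce a few test records to the topic our Streams app reads from.
    # parse.key/key.separator are optional and let us type "key:value" pairs.
    bin/kafka-console-producer.sh \
      --bootstrap-server localhost:9092 \
      --topic input-topic \
      --property parse.key=true \
      --property key.separator=:

    # Terminal 2: consume the records our Streams app wrote to its output topic.
    # --from-beginning replays existing records instead of waiting only for new ones.
    bin/kafka-console-consumer.sh \
      --bootstrap-server localhost:9092 \
      --topic output-topic \
      --from-beginning \
      --property print.key=true \
      --property key.separator=:

Each line typed into the producer (for example, user1:hello) is sent as one record when you press Enter; once the Kafka Streams application has processed it, the matching output record should appear in the consumer terminal.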

Make sure Kafka is up and running before you try out these commands.
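How exactly Kafka is started depends on your environment; the following is only a sketch assuming a standard Apache Kafka binary distribution run from its installation directory (a Docker- or KRaft-based setup would use different commands):

    # Start ZooKeeper in the background (not needed for KRaft-mode clusters).
    bin/zookeeper-server-start.sh config/zookeeper.properties &

    # Start the Kafka broker itself.
    bin/kafka-server-start.sh config/server.properties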
