Review the concepts covered throughout this course.

Final words

This course covered how to build applications using Apache Kafka and Java. We learned about the core Kafka Java client library features, including the Producer and Consumer APIs and their nuances. We took this further with Kafka Streams, learning how to process streaming data using the stateful and stateless operations of the Kafka Streams DSL, how to test Streams applications, and how to use the Interactive Queries feature. The Kafka Connect module covered how to build data pipelines using Source and Sink connectors and apply transformations to the data as it flows through them. Finally, we explored some Kafka ecosystem projects, including the Spring Kafka library, used Schema Registry to exchange Avro-formatted data, and learned about Kafka MirrorMaker.

Whether you took this course as a software developer, data engineer, or other data professional, you should now have the foundational knowledge required to use Kafka to build data-intensive solutions.
