Sending and Receiving Messages

Let’s learn how a message key solves the problem of messages being sent to random partitions.

Before starting this tutorial, please create a topic first. Learn how to create a topic in Kafka.

  • The partitioner first checks whether a key is present in the message or not. …
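The idea behind key-based partitioning can be sketched as follows. Note this is a simplified illustration using `String.hashCode()`; Kafka's actual default partitioner hashes the serialized key bytes with the murmur2 algorithm, so the partition numbers here will not match a real cluster's.

```java
// Simplified sketch of key-based partition selection. Kafka's real
// DefaultPartitioner hashes the serialized key bytes with murmur2;
// String.hashCode() is used here purely for illustration.
public class PartitionSketch {
    static int choosePartition(String key, int numPartitions) {
        // Same key -> same hash -> same partition, which is what preserves
        // per-key ordering. floorMod keeps the result non-negative.
        return Math.floorMod(key.hashCode(), numPartitions);
    }

    public static void main(String[] args) {
        int p1 = choosePartition("order-42", 4);
        int p2 = choosePartition("order-42", 4);
        System.out.println(p1 == p2); // the same key always maps to the same partition
    }
}
```

Because the hash of a given key is stable, all messages carrying that key land on the same partition instead of being scattered randomly.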

  • Spring is a dependency-injection framework that makes Java applications loosely coupled.
  • The Spring Framework simplifies the development of Java EE applications.
  • Spring enables you to build applications from “plain old Java objects” (POJOs) and to apply enterprise services non-invasively to POJOs.
  • Spring was created by Rod Johnson in 2003.

Introduction to Spring Framework


I) Core Container

Secrets of Kafka console producers

Yes, it is possible to create a topic using the console producer, but it is really a bad idea to create a topic this way. Read the blog below for details.

Can Kafka console producers also create topics?

— Yes, it is possible, but it is not a good idea to create a topic this way (for example, by pointing the console producer at a not-yet-existing topic with --bootstrap-server and --topic new_topic).
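For illustration, here is what that implicit creation looks like. This is an assumed setup (local broker on port 9092): if the broker has `auto.create.topics.enable=true`, producing to a missing topic silently creates it with the broker's default partition and replication settings, which is exactly why this is discouraged.

```shell
# Hypothetical example: producing to a topic that does not yet exist.
# With auto.create.topics.enable=true on the broker, "new_topic" is created
# implicitly, using broker defaults for partitions and replication.
.\bin\windows\kafka-console-producer.bat --bootstrap-server localhost:9092 --topic new_topic
```

A topic created this way gets whatever defaults the broker happens to have, rather than the partition count and replication factor you would choose deliberately.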

Creating and Deleting the Topics

Learn how to create and delete the topics.
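The standard way is the `kafka-topics` tool. The commands below are a sketch assuming a local broker on `localhost:9092` (recent Kafka versions use `--bootstrap-server`; very old versions used `--zookeeper` instead):

```shell
# Create a topic with an explicit partition count and replication factor.
.\bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --create --topic test-topic --partitions 4 --replication-factor 1

# List topics to verify it exists.
.\bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --list

# Delete the topic.
.\bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --delete --topic test-topic
```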

Getting Started

Spring KafkaTemplate

KafkaTemplate is a class from the Spring for Apache Kafka project that is used to produce messages to Kafka topics.


→ Think of KafkaTemplate as the Kafka counterpart of JdbcTemplate for database interactions.

The KafkaTemplate wraps a producer and provides convenient methods to send data to Kafka topics. The following listing shows the relevant methods from KafkaTemplate:

— If you look below, KafkaTemplate has many overloaded versions of the send method.
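The relevant overloads look roughly like this (recent Spring for Apache Kafka versions return `CompletableFuture<SendResult<K, V>>`; older versions returned `ListenableFuture<SendResult<K, V>>`):

```java
// Overloads of KafkaTemplate.send (Spring for Apache Kafka).
CompletableFuture<SendResult<K, V>> send(String topic, V data);
CompletableFuture<SendResult<K, V>> send(String topic, K key, V data);
CompletableFuture<SendResult<K, V>> send(String topic, Integer partition, K key, V data);
CompletableFuture<SendResult<K, V>> send(String topic, Integer partition, Long timestamp, K key, V data);
CompletableFuture<SendResult<K, V>> send(ProducerRecord<K, V> record);
CompletableFuture<SendResult<K, V>> send(Message<?> message);
```

The simplest overload just takes a topic and a value; the richer ones let you pin the key, partition, and timestamp, or pass a fully built `ProducerRecord`.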


Internal Working of KafkaTemplate

Do you know how KafkaTemplate works internally?

KafkaTemplate.send() sends the message to Kafka, but in reality the message passes through several layers before it is actually sent.

— The very first layer is the serializer. Any record sent to Kafka needs to be serialized to bytes.

There are two serialization techniques that…
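Concretely, a producer is configured with a key serializer and a value serializer. The sketch below assumes a broker on `localhost:9092` and uses Kafka's built-in `StringSerializer` (others include `IntegerSerializer` and `ByteArraySerializer`):

```java
import java.util.Properties;

// Sketch: configuring key and value serializers for a Kafka producer.
// Only the Properties object is built here, so no broker is needed to run it.
public class SerializerConfigSketch {
    static Properties producerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        // Both the key and the value must be turned into bytes before the
        // record leaves the producer.
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        System.out.println(producerProps().getProperty("key.serializer"));
    }
}
```

These two properties are what the serializer layer uses to convert the record's key and value to bytes before the message is handed to the network layer.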

Consumer Group

The consumer group-id is mandatory: a consumer cannot start without one, and it plays a major role when it comes to scalable message consumption.

Let’s consider a topic test-topic with 4 partitions. Now we have a consumer ready, with group 1. …
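The scaling behaviour can be sketched with a toy assignment function. This is a simplified round-robin model, not Kafka's actual assignors (range, round-robin, sticky), but it shows the key property: each partition is owned by exactly one consumer in the group.

```java
import java.util.ArrayList;
import java.util.List;

// Simplified sketch of partition assignment within one consumer group.
public class AssignmentSketch {
    static List<List<Integer>> assign(int numPartitions, int numConsumers) {
        List<List<Integer>> assignment = new ArrayList<>();
        for (int c = 0; c < numConsumers; c++) assignment.add(new ArrayList<>());
        // Deal partitions out round-robin; each partition goes to exactly
        // one consumer in the group.
        for (int p = 0; p < numPartitions; p++) {
            assignment.get(p % numConsumers).add(p);
        }
        return assignment;
    }

    public static void main(String[] args) {
        // One consumer in the group reads all 4 partitions of test-topic.
        System.out.println(assign(4, 1)); // [[0, 1, 2, 3]]
        // Adding a second consumer with the same group-id splits the work.
        System.out.println(assign(4, 2)); // [[0, 2], [1, 3]]
    }
}
```

This is why the group-id matters for scalability: consumers sharing a group-id divide the partitions among themselves, while consumers with different group-ids each receive every message.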

Handling Fault Tolerance

Make sure that 3 broker instances are running, along with ZooKeeper.

  1. Run the Kafka producer
.\bin\windows\kafka-console-producer.bat --broker-list localhost:9092 --topic test-topic-replicated
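The producer above assumes the replicated topic already exists. A sketch of creating it across the 3 brokers and checking where its leaders and replicas landed (assumed local setup):

```shell
# Create the topic with replication-factor 3 so each partition has a copy
# on every broker.
.\bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --create --topic test-topic-replicated --partitions 3 --replication-factor 3

# Show the leader, replicas, and in-sync replicas (Isr) for each partition.
.\bin\windows\kafka-topics.bat --bootstrap-server localhost:9092 --describe --topic test-topic-replicated
```

With replication factor 3, any single broker can fail and each partition still has two live copies, one of which can take over as leader.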

How the Kafka Cluster Distributes Client Requests Between Brokers

Before we get into the details of how Kafka distributes a client request, let’s discuss how topics are distributed.

We have a zookeeper and Kafka cluster. In this example, we have a cluster with 3 brokers.

  • Out of the 3 brokers, one broker acts as the Controller. Normally this is the first broker to join the cluster. Think of this as an additional role for that broker. …
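In a ZooKeeper-based cluster like the one described here, you can see which broker currently holds the controller role, since its broker id is stored in the `/controller` znode (assumed local ZooKeeper on port 2181):

```shell
# Inspect the /controller znode; the output includes the controller's
# broker id, e.g. {"version":1,"brokerid":0,...}.
.\bin\windows\zookeeper-shell.bat localhost:2181 get /controller
```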

Handling Data Loss

Learn how Kafka handles data loss in the event of failure.

Here we have a Kafka cluster and a representation of how the topic is distributed across the cluster and we have some records present in the file system.

As we all know, clients (producers and consumers) always talk to a partition's leader to produce and retrieve data.
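Two settings are commonly combined to reduce the risk of losing records when a leader fails; the values below are assumptions for this example cluster, not requirements:

```shell
# Producer side: wait for all in-sync replicas to acknowledge each write
# before considering it successful.
#   acks=all

# Broker/topic side: refuse writes unless at least 2 replicas are in sync,
# so an acknowledged record always exists on more than one broker.
.\bin\windows\kafka-configs.bat --bootstrap-server localhost:9092 --alter --entity-type topics --entity-name test-topic-replicated --add-config min.insync.replicas=2
```

Together, `acks=all` and `min.insync.replicas=2` mean an acknowledged record survives the loss of the leader, because a follower that already has the record can be elected as the new leader.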

Let’s say…

Sagar Kudu

Software Engineer at HCL | Technical Content Writing | Follow me on LinkedIn
