Apache Kafka Java Example

Apache Kafka: A Java Primer

Apache Kafka has become indispensable for building scalable, distributed, and real-time data streaming applications. If you’re a Java developer looking to integrate Kafka into your projects, this blog is for you! We’ll explore Kafka’s core concepts and provide a hands-on Java example to get you started.

What is Apache Kafka?

At its heart, Apache Kafka is a distributed publish-subscribe messaging system designed for high throughput and fault tolerance. Let’s break down some key terms (a short code sketch after this list shows how they map to the Java client APIs):

  • Topics: Kafka organizes data streams into logical categories called topics.
  • Producers: Applications that send messages (data) to Kafka topics.
  • Consumers: Applications that read messages from Kafka topics.
  • Brokers: Kafka servers that manage the storage and replication of messages.
  • Partitions: Topics are divided into partitions for scalability and parallelism.
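
To make these terms concrete, here is a minimal sketch that uses the Java AdminClient to create a topic with three partitions. The broker address, topic name, partition count, and class name are assumptions for illustration, not part of the walkthrough below:

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopicExample {
    public static void main(String[] args) throws Exception {
        // Assumes a single local broker; adjust bootstrap.servers for your cluster
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // "my-topic" with 3 partitions and replication factor 1 (single-broker setup)
            NewTopic topic = new NewTopic("my-topic", 3, (short) 1);
            admin.createTopics(Collections.singletonList(topic)).all().get();
        }
    }
}

Messages with the same key always land in the same partition, which is how Kafka preserves per-key ordering while still spreading load across partitions.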

Why Kafka?

  • Scalability: Kafka’s distributed architecture smoothly handles massive volumes of data.
  • High Throughput: Kafka sustains very high message rates while keeping end-to-end latency low, enabling real-time data processing.
  • Fault Tolerance: Data replication across brokers ensures the system remains operational even during failures.
  • Persistence: Kafka stores messages reliably on disk, preventing data loss. (A short sketch after this list shows producer settings that lean on these durability guarantees.)
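
As a rough illustration of how fault tolerance and persistence show up in client code, the snippet below is a sketch (the broker address and class name are assumptions) of the durability-oriented producer settings: acks=all waits for all in-sync replicas before a send is considered successful, and idempotence prevents duplicates when retries kick in.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;

public class DurableProducerExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // assumed local broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.ACKS_CONFIG, "all");                 // wait for all in-sync replicas
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");  // no duplicates on retry
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.toString(Integer.MAX_VALUE)); // retry transient failures

        // This producer only reports success once all in-sync replicas have the record
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // ... send records exactly as in the example later in this post
        }
    }
}

These settings trade a little latency for stronger delivery guarantees; newer producer clients already default to similar values, so spelling them out mainly documents the intent.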

Setting Up a Basic Java Example

Now, let’s dive into a simple Java example. We’ll set up a producer to send messages and a consumer to receive them.

  1. Dependencies: Include the Kafka client library in your Java project. Using Maven, add the following to your pom.xml:

<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>3.3.1</version>
</dependency>

  2. Kafka Producer: The producer below sends a single message to the topic my-topic. (A variant that checks delivery results with a callback is sketched after the walkthrough.)

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        // Broker address and serializers for the String key and value
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);

        // Send one record (topic, key, value), then close to flush any pending sends
        ProducerRecord<String, String> record = new ProducerRecord<>("my-topic", "key", "Hello, Kafka!");
        producer.send(record);
        producer.close();
    }
}

  3. Kafka Consumer: The consumer below subscribes to my-topic and prints every message it receives.

import java.time.Duration;
import java.util.Arrays;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class SimpleConsumer {
    public static void main(String[] args) {
        // Broker address, consumer group id, and deserializers for the String key and value
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "test-group");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Arrays.asList("my-topic"));

        // Poll in a loop and print the key and value of each record
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(100));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("Received message: key = %s, value = %s%n", record.key(), record.value());
            }
        }
    }
}

Remember: Start a local Kafka server before running this example.
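
One refinement worth knowing before moving on: producer.send() is asynchronous, and the example in step 2 ignores its outcome. The sketch below (same assumed broker and topic as step 2; the class name and message text are made up for illustration) passes a callback to send() so delivery metadata and errors are surfaced:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class CallbackProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record = new ProducerRecord<>("my-topic", "key", "Hello again, Kafka!");
            // The callback runs when the broker acknowledges (or rejects) the record
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    System.err.println("Send failed: " + exception.getMessage());
                } else {
                    System.out.printf("Delivered to %s-%d at offset %d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}

Closing the producer (done implicitly by try-with-resources here) blocks until outstanding sends complete, so the callback fires before the program exits.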

Next Steps

This example is just a starting point. Kafka offers far more! Consider exploring:

  • Kafka Streams library for powerful stream processing (a small topology is sketched after this list)
  • Spring Kafka for simplified integration with Spring projects
  • Error Handling, Security, and Advanced Configuration
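
To give a flavour of Kafka Streams, here is a minimal topology sketch; the application id, class name, and the my-topic-uppercase output topic are assumptions for illustration. It reads my-topic, upper-cases each value, and writes the result to another topic:

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class UppercaseStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "uppercase-app");   // assumed application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        // Topology: my-topic -> mapValues(toUpperCase) -> my-topic-uppercase
        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("my-topic");
        source.mapValues(value -> value.toUpperCase()).to("my-topic-uppercase");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

Kafka Streams handles partition assignment, state, and failover for you, which is why it is a natural next step after the raw producer and consumer APIs.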

 

 

You can find more information about Apache Kafka in the official Apache Kafka documentation.

 

Conclusion:

Unogeeks is the No.1 IT Training Institute for Apache Kafka Training. Anyone disagree? Please drop a comment.

You can check out our other latest blogs on Apache Kafka here – Apache Kafka Blogs

You can check out our Best in Class Apache Kafka details here – Apache Kafka Training
