Downloading and Working with Apache Kafka: A Python Primer
Apache Kafka has become ubiquitous in modern data architectures thanks to its ability to handle real-time data streams at scale. If you’re a Python developer looking to integrate Kafka into your projects, this primer walks through installation and some basic usage examples.
Getting Started
Prerequisites:
- Java: Kafka is built on the JVM, so you’ll need Java installed on your system; any recent JDK distribution will do.
- Python: Make sure you have a working version of Python installed (ideally Python 3).
Downloading Kafka:
- Visit the official Apache Kafka download page and grab the latest binary release. Extract the downloaded archive to your preferred location.
Installing the Python Client:
The most common way to interact with Kafka from Python is the kafka-python library. Install it with pip:
pip install kafka-python
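To confirm the client is installed, you can print its version from a Python shell (a quick sanity check; the exact version depends on what pip resolved):
import kafka
print(kafka.__version__)  # Any version string here confirms the install worked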
Basic Kafka Setup
Starting Zookeeper: Kafka relies on Zookeeper for coordination tasks. Inside your unzipped Kafka directory, navigate to the bin folder and run:
./zookeeper-server-start.sh ../config/zookeeper.properties
Starting the Kafka Server: Open a separate terminal window, still within the bin folder, and execute:
./kafka-server-start.sh ../config/server.properties
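With the broker running, you can create the topic used in the examples below. Many default setups auto-create topics on first use, but creating the topic explicitly lets you choose the partition count. Here is a minimal sketch using kafka-python’s admin client, assuming a single local broker on localhost:9092 and the topic name used throughout this post:
from kafka.admin import KafkaAdminClient, NewTopic

# Connect the admin client to the local broker
admin = KafkaAdminClient(bootstrap_servers='localhost:9092')

# One partition and replication factor 1 are enough for a single-broker dev setup
admin.create_topics([NewTopic(name='my-kafka-topic', num_partitions=1, replication_factor=1)])
admin.close()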
Python Producer Example
Let’s write a simple Python producer to send messages to a Kafka topic.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')  # Connect to Kafka
topic = 'my-kafka-topic'
for i in range(10):
    message = f"Test message {i}"
    producer.send(topic, message.encode('utf-8'))
producer.flush()  # Ensure messages are sent
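Note that send() is asynchronous and returns a future, so you can wait on it if you want confirmation of where a message landed. A small variation on the producer above (same broker and topic assumed):
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers='localhost:9092')
future = producer.send('my-kafka-topic', b'Test message with confirmation')
metadata = future.get(timeout=10)  # Block until the broker acknowledges the write
print(f"Written to partition {metadata.partition} at offset {metadata.offset}")
producer.flush()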
Python Consumer Example
Now, let’s create a consumer to read those messages:
from kafka import KafkaConsumer

consumer = KafkaConsumer('my-kafka-topic', bootstrap_servers='localhost:9092')
for message in consumer:
    print(f"Received: {message.value.decode('utf-8')}")
Key Points to Note
- Topics: Kafka organizes data streams into topics (think of these like categories).
- Brokers: Kafka brokers are the individual servers in a Kafka cluster.
- Producers: Producers write data to topics.
- Consumers: Consumers subscribe to topics to read data.
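To make these terms concrete, the same kafka-python client can report what the cluster looks like. A quick sketch, assuming the local broker used throughout:
from kafka import KafkaConsumer

# A consumer connection can also be used to inspect cluster metadata
consumer = KafkaConsumer(bootstrap_servers='localhost:9092')
print(consumer.topics())                                # All topics visible on the broker
print(consumer.partitions_for_topic('my-kafka-topic'))  # Partition ids for our topic
consumer.close()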
Beyond the Basics
This is just a taste of working with Kafka in Python. You can explore:
- Confluent Client: The confluent-kafka-python library offers additional features and is maintained by Confluent, a company founded by the creators of Kafka.
- Complex Data Formats: Learn how to handle structured data using Avro schema serialization.
- Consumer Groups: Scale consumption by coordinating consumer instances within groups.
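As a starting point for consumer groups, a consumer only needs a group_id to join one; Kafka then divides the topic’s partitions among all members of that group. A minimal sketch (the group name here is an arbitrary choice for illustration):
from kafka import KafkaConsumer

# Run several copies of this script: Kafka splits the topic's partitions among them
consumer = KafkaConsumer(
    'my-kafka-topic',
    bootstrap_servers='localhost:9092',
    group_id='my-consumer-group',  # Consumers sharing this id divide the work
    auto_offset_reset='earliest',
)
for message in consumer:
    print(f"Received: {message.value.decode('utf-8')}")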
Follow & Connect with us:
———————————-
For Training inquiries:
Call/Whatsapp: +91 73960 33555
Mail us at: info@unogeeks.com
Our Website ➜ https://unogeeks.com
Follow us:
Instagram: https://www.instagram.com/unogeeks
Facebook: https://www.facebook.com/UnogeeksSoftwareTrainingInstitute
Twitter: https://twitter.com/unogeek