Apache Kafka Setup with Docker + Python Producer & Consumer
This project sets up Apache Kafka using Docker containers and demonstrates a Producer-Consumer workflow in Python.
The producer (producer.py) generates transaction data, while the consumer (app.py) consumes the data from a Kafka topic.
📌 Prerequisites
Before you start, ensure you have the following installed:
- Docker
- Docker Compose
- Python 3.x
- Python Kafka client library:
```bash
pip install kafka-python
```
📂 Project Structure
```
.
├── docker-compose.yml   # Docker Compose file for Kafka + ZooKeeper
├── producer.py          # Kafka producer that generates transactions
├── app.py               # Kafka consumer that reads transactions
└── README.md            # Setup guide
```
🚀 Kafka Setup Instructions
1. Navigate to Project Directory
Make sure you are in the folder that contains the docker-compose.yml file:
```bash
cd /path/to/your/project
```
2. Start Kafka & ZooKeeper Containers
```bash
docker-compose up -d
```
- `-d` runs the containers in the background.
This starts:
- ZooKeeper (required by Kafka)
- Kafka Broker
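
The `docker-compose.yml` itself is not reproduced in this README. A minimal sketch of what such a file might contain — the image names, versions, and environment values below are assumptions, not taken from this project:

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181          # ZooKeeper's default client port

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"                        # expose the broker to the host
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1   # single-broker setup
```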
3. Verify Running Containers
```bash
docker ps
```
You should see containers for kafka and zookeeper.
📡 Kafka Topic Setup
1. Access Kafka Container
```bash
docker exec -it <kafka-container-name> bash
```
2. Create a Topic
```bash
bin/kafka-topics.sh --create \
  --topic transactions \
  --bootstrap-server localhost:9092 \
  --partitions 1 \
  --replication-factor 1
```
3. List Topics
```bash
bin/kafka-topics.sh --list --bootstrap-server localhost:9092
```
You should see `transactions` in the list.
📝 Running Producer & Consumer
1. Run Producer (producer.py)
This script generates transaction data and sends it to the Kafka topic transactions.
```bash
python producer.py
```
You should see messages being sent to Kafka.
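
The contents of `producer.py` are not listed in this README. A minimal sketch of what such a producer might look like with `kafka-python` — the field names, send interval, and `run()` helper are illustrative assumptions:

```python
import json
import random
import time
import uuid


def make_transaction():
    # Generate one fake transaction record; the field names here
    # are assumptions, not taken from the actual producer.py.
    return {
        "transaction_id": str(uuid.uuid4()),
        "amount": round(random.uniform(1.0, 500.0), 2),
        "currency": "USD",
        "timestamp": time.time(),
    }


def serialize(record):
    # Kafka messages are bytes, so encode each dict as UTF-8 JSON.
    return json.dumps(record).encode("utf-8")


def run(bootstrap="localhost:9092", topic="transactions", count=10):
    # Requires `pip install kafka-python` and a broker reachable
    # at `bootstrap` (the Docker setup above).
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers=bootstrap,
                             value_serializer=serialize)
    for _ in range(count):
        producer.send(topic, make_transaction())
        time.sleep(1)
    producer.flush()  # block until all buffered messages are sent
```

Calling `run()` sends ten records to the `transactions` topic; without the containers running it will fail to connect.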
2. Run Consumer (app.py)
This script consumes messages from the transactions topic and displays/uses them.
```bash
python app.py
```
You should see messages being received in real time.
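
Like the producer, `app.py` itself is not shown here. A minimal consumer sketch with `kafka-python` — the deserializer mirrors the assumed JSON encoding above, and the offset-reset choice is an assumption:

```python
import json


def deserialize(raw_bytes):
    # Mirror of the producer's serializer: UTF-8 JSON bytes back to a dict.
    return json.loads(raw_bytes.decode("utf-8"))


def run(bootstrap="localhost:9092", topic="transactions"):
    # Requires `pip install kafka-python` and a broker reachable
    # at `bootstrap` (the Docker setup above).
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        topic,
        bootstrap_servers=bootstrap,
        auto_offset_reset="earliest",   # also read messages sent before startup
        value_deserializer=deserialize,
    )
    for message in consumer:            # blocks, yielding messages as they arrive
        print(f"Received: {message.value}")
```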
🛑 Stopping Services
Stop Kafka and ZooKeeper:
```bash
docker-compose down
```
Restart later with:
```bash
docker-compose up -d
```
⚙️ Notes
- Kafka default ports:
  - 9092 → Kafka broker
  - 2181 → ZooKeeper
- Both `producer.py` and `app.py` are configured to use the topic `transactions`. If you change the topic name or Kafka connection settings, update the scripts accordingly.
- Logs can be checked using:
```bash
docker logs <kafka-container-name>
```
✅ With this setup, you can:
- Run Kafka & ZooKeeper inside Docker
- Produce transaction data via `producer.py`
- Consume transaction data via `app.py`