Local Kafka Broker

A dockerized Kafka broker with just one node to easily test Kafka integrations locally.

This implementation is based on @Boyu1997's post: Intro to Kafka.

Tech Stack

Broker: Kafka, Zookeeper, Kafdrop, Docker Compose

Client: confluent-kafka-python

Features

Single-node Kafka broker managed with Docker Compose

Kafdrop UI to inspect topics and messages

Example producer and consumer clients written in Python

Run Locally

Clone the project

  git clone https://github.com/alanmvarela/local_kafka_broker.git

Go to the project directory

  cd local_kafka_broker

Start containers

  docker-compose up -d

Check that the three containers (Zookeeper, Kafka, and Kafdrop) are running

  docker ps
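
Optionally, you can verify from Python that the broker is reachable before running the clients. This is only a minimal sketch using confluent-kafka-python, and it assumes the broker is exposed on localhost:9092 (check docker-compose.yml for the actual port mapping):

  from confluent_kafka.admin import AdminClient

  # Assumed bootstrap address; adjust to match docker-compose.yml if different.
  admin = AdminClient({"bootstrap.servers": "localhost:9092"})

  # list_topics() raises a KafkaException if no broker responds within the timeout.
  metadata = admin.list_topics(timeout=5)
  print("Brokers:", list(metadata.brokers.keys()))
  print("Topics:", list(metadata.topics.keys()))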

Stop Broker

To stop the broker, run

  docker-compose stop

Then clean up the previous execution's data with the command below if you want a fresh start on the next run.

  rm -rf data

Running Tests

To test the broker, you can use the clients to produce and consume some mock messages.

First, open the Kafdrop UI in your browser at the URL below and create a topic named test_topic.

  http://localhost:9000/
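
If you prefer creating the topic from code instead of through the UI, here is a minimal sketch using confluent-kafka-python's AdminClient. It assumes the broker is exposed on localhost:9092 and is illustrative only, not part of the repository:

  from confluent_kafka.admin import AdminClient, NewTopic

  # Assumed bootstrap address; adjust if docker-compose.yml maps a different port.
  admin = AdminClient({"bootstrap.servers": "localhost:9092"})

  # One partition and replication factor 1 are enough for a single-node broker.
  futures = admin.create_topics([NewTopic("test_topic", num_partitions=1, replication_factor=1)])
  for topic, future in futures.items():
      try:
          future.result()  # block until the broker confirms the operation
          print(f"Created topic {topic}")
      except Exception as exc:
          print(f"Failed to create topic {topic}: {exc}")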

Run the producer script to generate 10 new messages for the test_topic topic. After running this script, you should be able to see the new messages within the topic in the UI.

  python3 client/python/producer.py
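
The repository's client/python/producer.py is the script actually used here; purely as a rough sketch of what producing 10 mock messages with confluent-kafka-python could look like (again assuming the broker is exposed on localhost:9092), something along these lines would work:

  from confluent_kafka import Producer

  # Assumed bootstrap address; adjust to the port exposed by docker-compose.yml.
  producer = Producer({"bootstrap.servers": "localhost:9092"})

  def delivery_report(err, msg):
      # Called once per message after the broker acknowledges (or rejects) it.
      if err is not None:
          print(f"Delivery failed: {err}")
      else:
          print(f"Delivered to {msg.topic()} [{msg.partition()}] at offset {msg.offset()}")

  for i in range(10):
      producer.produce("test_topic", value=f"mock message {i}".encode("utf-8"), callback=delivery_report)

  # Wait until all queued messages are delivered before exiting.
  producer.flush()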

Run the consumer script to consume all messages queued for the test_topic topic. After running this script, you should be able to see in the UI that all messages in the topic have been consumed.

  python3 client/python/consumer.py
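
As with the producer, the repository's client/python/consumer.py is the script used above. A sketch of an equivalent consumer with confluent-kafka-python (same localhost:9092 assumption, with an illustrative group id; stop it with Ctrl+C) could look like this:

  from confluent_kafka import Consumer

  consumer = Consumer({
      "bootstrap.servers": "localhost:9092",  # assumed bootstrap address
      "group.id": "test_consumer_group",      # illustrative group id
      "auto.offset.reset": "earliest",        # start from the oldest unread message
  })
  consumer.subscribe(["test_topic"])

  try:
      while True:
          msg = consumer.poll(timeout=1.0)
          if msg is None:
              continue  # no message within the poll timeout
          if msg.error():
              print(f"Consumer error: {msg.error()}")
              continue
          print(f"Consumed: {msg.value().decode('utf-8')}")
  except KeyboardInterrupt:
      pass
  finally:
      consumer.close()  # commit final offsets and leave the group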

Project Structure

.
├── client
│   └── python
│       ├── consumer.py
│       └── producer.py
├── docker-compose.yml
├── LICENSE
└── README.md

Lessons Learned

I learned how to dockerize and use a local instance of Zookeeper, Kafka, and the Kafdrop UI.

I was also able to provide implementations of a basic producer and consumer in Python.

Roadmap

Authors

@alanmvarela