This post gets Kafka running locally and validates the setup with the minimal CLI loop: create a topic, produce messages, and consume them. Ref: Kafka quickstart.

Download links are at the end of the post: see Downloads.

Quick takeaways

  • Kafka is easiest to learn locally with Docker.
  • CLI is enough to validate your setup.
  • Once this works, you can integrate with Spark or Python.

Run it yourself

  • Local Docker: main path for this blog.
  • Databricks: not needed for this post.
```shell
docker compose up
```
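If you don't already have a compose file, here is a minimal single-node sketch. It assumes the official apache/kafka image, which defaults to a single-node KRaft broker on port 9092; the image tag is an assumption, not from the original post.

```yaml
# Minimal single-node Kafka for local testing.
# Assumption: the apache/kafka image starts a single KRaft broker on 9092 by default.
services:
  kafka:
    image: apache/kafka:latest
    ports:
      - "9092:9092"
```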


Create a topic

```shell
kafka-topics.sh --create --topic demo-events --bootstrap-server localhost:9092 --partitions 3 --replication-factor 1
```

Expected output (example):

```
Created topic demo-events.
```
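To confirm the topic was created with the partition count you asked for, the same script can describe it:

```shell
# Show partitions, replication factor, and leader assignments for the topic
kafka-topics.sh --describe --topic demo-events --bootstrap-server localhost:9092
```

You should see three partition rows, each with a replication factor of 1.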

Produce messages

1
kafka-console-producer.sh --topic demo-events --bootstrap-server localhost:9092

Type a few lines, pressing Enter after each; each line is sent as one message. Exit with Ctrl-C.
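By default each line becomes a message with a null key. If you want keyed messages (for example, to control which partition a message lands in), the console producer supports this via properties; a sketch:

```shell
# Treat each input line as key:value; the key determines the partition
kafka-console-producer.sh --topic demo-events --bootstrap-server localhost:9092 \
  --property parse.key=true --property key.separator=:
```

With this, typing `user1:hello` sends a message with key `user1` and value `hello`.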


Consume messages

```shell
kafka-console-consumer.sh --topic demo-events --from-beginning --bootstrap-server localhost:9092
```

Expected output: the lines you typed into the producer.
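To see how offsets are tracked across restarts, run the consumer as part of a consumer group (the group name `demo-group` is an arbitrary choice, not from the original post):

```shell
# With a group, committed offsets persist on the broker,
# so a restarted consumer resumes where it left off
kafka-console-consumer.sh --topic demo-events --bootstrap-server localhost:9092 \
  --group demo-group --from-beginning
```

Note that `--from-beginning` only takes effect the first time, before the group has committed any offsets; subsequent runs continue from the last committed position.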


What to verify

  • Messages you typed are visible in the consumer.
  • The topic has the partition count you set.
  • You can stop and restart the consumer without data loss: messages stay on the broker, and --from-beginning re-reads them.

Downloads

If you want to run this without copying code, download the notebook or the .py export.