Series: Kafka 101
1/3. Local Kafka with CLI, your first run
2/3. Kafka consumer groups, explained
3/3. Read Kafka with Spark Streaming
This post gets Kafka running locally and validates the setup with a minimal CLI loop: create a topic, produce messages, and consume them. Ref: Kafka quickstart.
Download links are in the Downloads section at the end of this post.
Quick takeaways
- Kafka is easiest to learn locally with Docker.
- CLI is enough to validate your setup.
- Once this works, you can integrate with Spark or Python.
Run it yourself
- Local Docker: main path for this blog.
- Databricks: not needed for this post.
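A minimal sketch of the local Docker path, assuming the official `apache/kafka` image (the image tag, container name, and port mapping are assumptions; adjust to your environment):

```shell
# Start a single-node Kafka broker in KRaft mode (the apache/kafka image's
# default) and expose the broker on localhost:9092.
docker run -d --name kafka -p 9092:9092 apache/kafka:latest
```

Give the container a few seconds to start before running the CLI commands below.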
Create a topic
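A sketch of the topic-creation step, assuming a broker on `localhost:9092` running in a Docker container named `kafka` with the CLI scripts under `/opt/kafka/bin` (all assumptions based on the `apache/kafka` image):

```shell
# Create a topic named demo-events with 3 partitions.
docker exec kafka /opt/kafka/bin/kafka-topics.sh --create \
  --topic demo-events \
  --partitions 3 \
  --bootstrap-server localhost:9092
```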
Expected output (example):
Created topic demo-events.
Produce messages
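A sketch of the producer step, under the same assumptions (container named `kafka`, broker on `localhost:9092`):

```shell
# Open an interactive console producer attached to the demo-events topic.
docker exec -it kafka /opt/kafka/bin/kafka-console-producer.sh \
  --topic demo-events \
  --bootstrap-server localhost:9092
```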
Type a few lines, pressing Enter after each; every line is sent to the topic as one message. Exit with Ctrl+C when done.
Consume messages
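A sketch of the consumer step, under the same assumptions; `--from-beginning` makes the consumer replay the topic from the earliest offset rather than only new messages:

```shell
# Read all messages on demo-events from the start of the topic.
docker exec -it kafka /opt/kafka/bin/kafka-console-consumer.sh \
  --topic demo-events \
  --from-beginning \
  --bootstrap-server localhost:9092
```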
Expected output: You will see the lines you typed in the producer.
What to verify
- Messages you typed are visible in the consumer.
- The topic has the partition count you set.
- Stopping and restarting the consumer with `--from-beginning` replays the same messages; Kafka persists them on disk, so no data is lost.
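The partition-count check above can be done from the CLI, under the same assumptions as the earlier commands (container named `kafka`, broker on `localhost:9092`):

```shell
# Describe the topic; the output lists one line per partition,
# so you can confirm the count matches what you created.
docker exec kafka /opt/kafka/bin/kafka-topics.sh --describe \
  --topic demo-events \
  --bootstrap-server localhost:9092
```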
Downloads
If you want to run this without copying code, download the notebook or the .py export.