Closed whwright closed 10 months ago
Does the topic by any chance contain existing messages produced by another producer? The consumer's auto offset reset is "earliest", so it will consume the topic from the beginning, and in Kafka the broker sends messages exactly as they are stored, since it delegates most of the CPU work (including decompression) to the clients. So you could face:
> Does the topic by any chance contain existing messages produced by another producer?
No, this is not possible; I am running this example on a brand-new topic where the event in my script is the first and only one published.
Do you have access to the segment files on the broker? If so, you can check the compression with the Kafka tools provided in the Kafka binaries. There may also be third-party tools that can report this while consuming a topic. In aiokafka, it seems the error you have should happen when reading messages whose header says the compression is "snappy":
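For example, assuming shell access to a broker (the log path and topic name below are placeholders for your setup), the `DumpLogSegments` tool shipped with the Kafka binaries can print the compression codec of the stored record batches:

```shell
# Inspect a segment file directly on the broker; the printed batch
# metadata includes the codec (look for fields like "compresscodec").
# Adjust the path to match your broker's log.dirs setting.
kafka-run-class.sh kafka.tools.DumpLogSegments \
  --files /var/kafka-logs/my-topic-0/00000000000000000000.log \
  --print-data-log
```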
I found out that the topic I was creating was being created with `compression.type: "snappy"` without me providing that config in the API call. I am using a shared cluster, so I suspect that is the cluster's default in some way.
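In case it helps anyone hitting the same thing: the effective topic config and the broker default it inherits can be checked with `kafka-configs.sh` (broker address and topic name below are placeholders):

```shell
# Configs set explicitly on the topic; a dynamic compression.type
# override would show up here.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name my-topic --describe

# Broker-level defaults, which topics inherit when they don't set
# compression.type themselves.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type brokers --entity-default --describe
```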
I re-created my topic using `compression.type: "producer"` and my example above now works.
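For reference, a sketch of recreating a topic with `compression.type=producer` via the CLI (topic name, partition count, and broker address are example values):

```shell
# Delete the old topic and recreate it so the broker stores batches
# with whatever codec the producer used (no broker-side recompression).
kafka-topics.sh --bootstrap-server localhost:9092 --delete --topic my-topic
kafka-topics.sh --bootstrap-server localhost:9092 --create --topic my-topic \
  --partitions 1 --replication-factor 1 \
  --config compression.type=producer
```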
Thank you for your help, sorry it ended up being a dumb mistake!
Describe the bug When trying to run a very basic producer/consumer I am getting
for events that are produced without any compression type defined.
Expected behaviour When no compression type is defined, I shouldn't need snappy to consume events
Environment (please complete the following information):
- aiokafka version (`python -c "import aiokafka; print(aiokafka.__version__)"`): 0.8.1
- kafka-python version (`python -c "import kafka; print(kafka.__version__)"`): 2.0.2
- Kafka broker version (`kafka-topics.sh --version`): 2.8.0

Reproducible example
and the output