dpkp / kafka-python

Python client for Apache Kafka
http://kafka-python.readthedocs.io/
Apache License 2.0

Silent MessageSizeTooLargeError #841

Closed lopuhin closed 7 years ago

lopuhin commented 7 years ago

When trying to send a message that is larger than the maximum allowed size (the max_request_size argument of KafkaProducer), the message is silently dropped, and only debug logs show that it was dropped:

DEBUG:kafka.producer.sender:Starting Kafka producer I/O thread.
DEBUG:kafka.producer.kafka:Kafka producer started
DEBUG:kafka.producer.kafka:Exception occurred during message send: [Error 10] MessageSizeTooLargeError: The message is 13561908 bytes when serialized which is larger than the maximum request size you have configured with the max_request_size configuration
DEBUG:kafka.producer.kafka:Flushing accumulated records in producer.

The code that sends the message is just

from kafka import KafkaProducer

producer = KafkaProducer()
producer.send(topic, data)
producer.flush()

I would expect an exception to be thrown. I'm new to kafka so I'm not sure if my expectations are reasonable.

dpkp commented 7 years ago

KafkaProducer.send returns a future. To resolve, try:

future = producer.send(topic, data)
result = future.get(timeout=123)

The exception will be raised by get().
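
A minimal sketch of that pattern, using the same placeholder topic and data as above (MessageSizeTooLargeError is a KafkaError subclass, so catching KafkaError covers it):

from kafka import KafkaProducer
from kafka.errors import KafkaError

producer = KafkaProducer()
future = producer.send(topic, data)
try:
    # get() waits for the send to resolve and re-raises any error
    # recorded on the future instead of dropping it silently
    result = future.get(timeout=10)
except KafkaError as exc:
    print("send failed:", exc)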

lopuhin commented 7 years ago

Right, I didn't notice it, thanks for making it clear! The exception does happen with get().

rajish commented 5 years ago

I still think the error should be reported at least at the WARN level.
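
Until the library logs this more loudly, one way for a caller to surface the failure at WARNING level is to attach an errback to the future returned by send(); a rough sketch, with topic and data as placeholders:

import logging
from kafka import KafkaProducer

log = logging.getLogger(__name__)

producer = KafkaProducer()
future = producer.send(topic, data)
# add_errback() invokes the callback with the exception if the send fails,
# including when the future was already failed at send() time
future.add_errback(lambda exc: log.warning("Dropped message: %s", exc))
producer.flush()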

sksethi25 commented 3 years ago

@rajish I was facing the same issue. I am thinking of using the future.failed() method to check whether the send failed. Unlike get(), I don't think it blocks until the message is sent to Kafka. @dpkp can you clarify whether it is okay to use the failed() method for this?
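
A rough sketch of that non-blocking check, again with placeholder topic and data; note that failed() only reports an error once the future has resolved, so for failures that happen asynchronously it should be checked after flush() rather than immediately after send():

from kafka import KafkaProducer

producer = KafkaProducer()
future = producer.send(topic, data)
producer.flush()  # make sure the future has resolved one way or the other
if future.failed():
    # future.exception holds the error, e.g. MessageSizeTooLargeError
    print("send failed:", future.exception)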