AndreaBencini90 opened 8 months ago
Can you confirm that you are using long and not int for the type? I can see it is trying to deserialize time in milliseconds.
Can you please provide the schema information and the message?
The problem happens when the consumer tries to deserialize the value -9223370327508000000 from the topic. Obviously this is a problem with the data and not with the library. Still, because the consumer hits the error, it cannot consume the rest of the message. In my opinion, it would be nice to have an option for handling these situations: the user should be able to choose whether to let deserialization break with an exception or to leave the offending field unconverted.
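For illustration, a minimal sketch of the kind of opt-in handling I mean; the wrapper below is hypothetical and not part of confluent-kafka, it simply catches the conversion error per record instead of letting it stop the consumer:

```python
from confluent_kafka.serialization import SerializationContext, MessageField

def deserialize_or_skip(avro_deserializer, msg):
    """Hypothetical wrapper: return None instead of raising when a
    logical-type value (e.g. timestamp-millis) cannot be converted."""
    ctx = SerializationContext(msg.topic(), MessageField.VALUE)
    try:
        return avro_deserializer(msg.value(), ctx)
    except (OverflowError, ValueError) as exc:
        # Bad data inside the message (e.g. an out-of-range timestamp);
        # log it and keep consuming instead of blocking the partition.
        print(f"Skipping record at offset {msg.offset()}: {exc}")
        return None
```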
I seem to be getting this error as well when we define the date in milliseconds as 865716973869987, which corresponds to Sat Jun 25 29403 09:11:09. It seems that fastavro uses Python's datetime library to parse these dates, and if I read correctly, datetime only supports dates up to datetime.date(9999, 12, 31).
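A quick way to see that limit; the epoch-plus-timedelta arithmetic below mirrors what a timestamp-millis decoder has to do, though the exact fastavro internals may differ:

```python
import datetime

# Python's datetime types only cover years 1 through 9999
print(datetime.date.max)      # 9999-12-31
print(datetime.datetime.max)  # 9999-12-31 23:59:59.999999

epoch = datetime.datetime(1970, 1, 1, tzinfo=datetime.timezone.utc)
millis = 865716973869987      # the timestamp-millis value above (~year 29403)

try:
    print(epoch + datetime.timedelta(milliseconds=millis))
except OverflowError as exc:
    print("OverflowError:", exc)  # date value out of range
```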
Description
I'm encountering an OverflowError when attempting to deserialize messages using the confluent_kafka Avro deserializer in Python. In simplified form: self.deserializer is an AvroDeserializer object from confluent_kafka.schema_registry.avro, and the error is raised when I call self.deserializer(msg_value, None).
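A minimal sketch of that setup (the schema registry URL below is a placeholder, not the real configuration):

```python
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroDeserializer

# Placeholder registry URL
schema_registry_client = SchemaRegistryClient({"url": "http://localhost:8081"})
deserializer = AvroDeserializer(schema_registry_client)

def decode(msg_value: bytes):
    # Passing None as the SerializationContext, as in the original call;
    # the OverflowError surfaces here when the payload contains an
    # out-of-range timestamp-millis value.
    return deserializer(msg_value, None)
```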
How to reproduce
confluent_avro 1.8.0
confluent-kafka 2.3.0
fastavro 1.9.2
kafka-python 2.0.2
[ ] Operating system: Windows

Checklist
Please provide the following information:

[ ] confluent-kafka-python and librdkafka version (confluent_kafka.version() and confluent_kafka.libversion()):
[ ] Apache Kafka broker version:
[ ] Client configuration: {...}
[ ] Operating system:
[ ] Provide client logs (with 'debug': '..' as necessary)
[ ] Provide broker log excerpts
[ ] Critical issue
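For the two version items in the checklist, confluent_kafka exposes both values directly; a small snippet to print them:

```python
import confluent_kafka

# Client (confluent-kafka-python) and underlying librdkafka versions
print(confluent_kafka.version())     # version string plus an int form
print(confluent_kafka.libversion())  # librdkafka version string plus an int form
```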