Closed. gpedras closed this issue 1 year ago.
This issue has been automatically marked as stale because it has been open for 90 days with no activity. Remove the stale label or comment, or this issue will be closed in 30 days.
This issue was automatically closed because it remained stale for 30 days.
Describe the bug
I'm trying to send records to Kafka, but it doesn't seem to work. No matter which configuration I use, I always get one of the two errors below. I've tried almost every buffer combination, and it feels like the configuration is being ignored because nothing seems to change.
To Reproduce
I'm using Kafka 2.0.0 with 4 distinct brokers in the same network. If I set compression_codec, the error is "chunk bytes limit exceeds for an emitted event stream". If I omit it, the error is "error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"".
I also tried different values for chunk_limit_size, as low as 100 bytes, but the error is still "Kafka::MessageSizeTooLarge". I've seen a suggestion to use chunk_limit_records instead, but that doesn't work for me either.
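For reference, here is a minimal sketch of the kind of configuration I'm describing. The match pattern, broker addresses, topic name, buffer path, and the exact limit values are placeholders, not my actual settings:

```
<match app.**>
  @type kafka2
  # placeholder broker list; in my setup there are 4 brokers in the same network
  brokers broker1:9092,broker2:9092,broker3:9092,broker4:9092
  default_topic my_topic

  # with this line present I get "chunk bytes limit exceeds for an emitted event stream";
  # without it I get Kafka::MessageSizeTooLarge
  compression_codec gzip

  <format>
    @type json
  </format>

  <buffer>
    @type file
    path /var/log/fluentd/buffer/kafka
    # tried values from 100 bytes upward; the error does not change
    chunk_limit_size 1m
    # also tried chunk_limit_records as suggested elsewhere, with no effect
    # chunk_limit_records 100
    flush_interval 5s
  </buffer>
</match>
```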
Expected behavior
The records should be delivered to Kafka successfully without any errors.
Your Environment
Your Configuration
Your Error Log
Additional context
No response