fluent / fluent-plugin-kafka

Kafka input and output plugin for Fluentd

Every buffer configuration returns Kafka::MessageSizeTooLarge or chunk bytes limit exceeds for an emitted event stream #473

Closed gpedras closed 1 year ago

gpedras commented 1 year ago

Describe the bug

No matter which configuration I use, I always get one of two errors.

I'm trying to send records to Kafka, but it doesn't work. I've tried almost every buffer combination, and it feels like the configuration is being ignored because nothing changes.

To Reproduce

I'm using Kafka 2.0.0 with 4 distinct brokers on the same network. If I set compression_codec, the error is "chunk bytes limit exceeds for an emitted event stream:". If I omit it, the error is "error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"".

I also tried different values for chunk_limit_size, as low as 100 bytes, but the error is still "Kafka::MessageSizeTooLarge". I've seen a suggested fix where someone used "chunk_limit_records", but that doesn't work for me either.
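Editor's note: Kafka brokers reject produce requests larger than message.max.bytes (1 MB by default), which is independent of Fluentd's chunk_limit_size, so measuring per-record serialized and compressed sizes can help pick a workable limit. A minimal sketch, assuming a hypothetical record shape (the record fields and the 1 MB limit are illustrative, not taken from this report):

```python
import gzip
import json

# Kafka's default broker-side message.max.bytes is roughly 1 MB.
BROKER_MESSAGE_MAX_BYTES = 1_000_000

# Hypothetical log record, standing in for whatever Fluentd emits.
record = {"host": "web-01", "message": "x" * 500, "level": "info"}

raw = json.dumps(record).encode("utf-8")
compressed = gzip.compress(raw)

print(f"raw: {len(raw)} bytes, gzip: {len(compressed)} bytes")
print(f"records fitting under the limit: ~{BROKER_MESSAGE_MAX_BYTES // len(raw)}")
```

If typical records are far smaller than the limit, the oversized produce request likely comes from the whole buffer chunk being sent as one batch rather than from any single record.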

Expected behavior

The messages should be delivered to Kafka successfully, without any errors.

Your Environment

- Calyptia-Fluentd version: 1.4.2
- Fluentd version: 1.15.3
- fluent-plugin-kafka version: 0.18.1
- ruby-kafka version: 1.5.0
- Operating system: Ubuntu 20.04 LTS
- Kernel version: 5.4.0-110-generic

Your Configuration

<match xxxxxxxxx>
        @type kafka2
        brokers XXXXX
        default_topic sampletopic
        compression_codec gzip
        <buffer tag>
                @type file
                path /var/log/calyptia-fluentd/xxxxxxxxx
                chunk_limit_size 100000
                chunk_limit_records 100
        </buffer>
        <format>
                @type json
        </format>
</match>
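Editor's note: besides shrinking buffer chunks, fluent-plugin-kafka exposes max_send_limit_bytes, which skips individual records larger than the given size, and chunk_limit_size accepts size suffixes like "k" and "m". A sketch of one possible configuration that keeps payloads under Kafka's default 1 MB message.max.bytes (the 900 KB headroom is an assumption; topic, brokers, and path are the reporter's placeholders):

```
<match xxxxxxxxx>
        @type kafka2
        brokers XXXXX
        default_topic sampletopic
        compression_codec gzip
        # Assumed headroom under the broker's default 1 MB message.max.bytes:
        # records larger than this are skipped instead of failing the flush.
        max_send_limit_bytes 900000
        <buffer tag>
                @type file
                path /var/log/calyptia-fluentd/xxxxxxxxx
                # Size suffixes are accepted; 900k keeps each chunk,
                # and hence each produce request, under the broker limit.
                chunk_limit_size 900k
        </buffer>
        <format>
                @type json
        </format>
</match>
```

Raising message.max.bytes on the brokers (and max.message.bytes on the topic) is the alternative when individual records legitimately exceed 1 MB.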

Your Error Log

2022-11-30 16:10:20 +0000 [warn]: #0 failed to flush the buffer. retry_times=0 next_retry_time=2022-11-30 16:10:21 +0000 chunk="5eeb25739e0c8dbe9af771f0620a31c6" error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol.rb:160:in `handle_error'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:153:in `block in handle_response'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:36:in `block (2 levels) in each_partition'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:35:in `each'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:35:in `block in each_partition'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:34:in `each'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:34:in `each_partition'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:144:in `handle_response'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:133:in `block in send_buffered_messages'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:105:in `each'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:105:in `send_buffered_messages'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:62:in `block in execute'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/instrumenter.rb:23:in `instrument'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:53:in `execute'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:213:in `block in deliver_messages_with_retries'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:203:in `loop'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:203:in `deliver_messages_with_retries'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:129:in `deliver_messages'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/out_kafka2.rb:377:in `write'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:1180:in `try_flush'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:1501:in `flush_thread_run'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:501:in `block (2 levels) in start'
  2022-11-30 16:10:20 +0000 [warn]: #0 /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
2022-11-30 16:10:21 +0000 [warn]: #0 Send exception occurred: Kafka::MessageSizeTooLarge
2022-11-30 16:10:21 +0000 [warn]: #0 Exception Backtrace : /opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol.rb:160:in `handle_error'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:153:in `block in handle_response'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:36:in `block (2 levels) in each_partition'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:35:in `each'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:35:in `block in each_partition'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:34:in `each'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/protocol/produce_response.rb:34:in `each_partition'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:144:in `handle_response'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:133:in `block in send_buffered_messages'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:105:in `each'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:105:in `send_buffered_messages'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:62:in `block in execute'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/instrumenter.rb:23:in `instrument'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/ruby-kafka-1.5.0/lib/kafka/produce_operation.rb:53:in `execute'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:213:in `block in deliver_messages_with_retries'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:203:in `loop'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:203:in `deliver_messages_with_retries'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/kafka_producer_ext.rb:129:in `deliver_messages'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluent-plugin-kafka-0.18.1/lib/fluent/plugin/out_kafka2.rb:377:in `write'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:1180:in `try_flush'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:1501:in `flush_thread_run'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin/output.rb:501:in `block (2 levels) in start'
/opt/calyptia-fluentd/lib/ruby/gems/3.0.0/gems/fluentd-1.15.3/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
2022-11-30 16:10:21 +0000 [info]: #0 initialized kafka producer: fluentd
2022-11-30 16:10:21 +0000 [warn]: #0 failed to flush the buffer. retry_times=1 next_retry_time=2022-11-30 16:10:24 +0000 chunk="5eeb25739e0c8dbe9af771f0620a31c6" error_class=Kafka::MessageSizeTooLarge error="Kafka::MessageSizeTooLarge"
  2022-11-30 16:10:21 +0000 [warn]: #0 suppressed same stacktrace
2022-11-30 16:10:24 +0000 [warn]: #0 Send exception occurred: Kafka::MessageSizeTooLarge

AND

2022-11-30 17:59:13 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 1023550bytes
2022-11-30 17:59:21 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 4430883bytes
2022-11-30 17:59:22 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 391734bytes
2022-11-30 17:59:37 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 4395495bytes
2022-11-30 17:59:56 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 177426bytes
2022-11-30 18:00:07 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 217934bytes
2022-11-30 18:00:14 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 1351232bytes
2022-11-30 18:00:21 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 3500976bytes
2022-11-30 18:00:23 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 817739bytes
2022-11-30 18:01:08 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 8119801bytes
2022-11-30 18:01:19 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 1562523bytes
2022-11-30 18:01:28 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 874708bytes
2022-11-30 18:01:37 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 3982204bytes
2022-11-30 18:01:40 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 171648bytes
2022-11-30 18:01:48 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 8046157bytes
2022-11-30 18:02:19 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 127904bytes
2022-11-30 18:02:21 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 8149307bytes
2022-11-30 18:02:25 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 1454418bytes
2022-11-30 18:02:34 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 667629bytes
2022-11-30 18:03:00 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 8131106bytes
2022-11-30 18:03:24 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 796411bytes
2022-11-30 18:03:25 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 241876bytes
2022-11-30 18:03:33 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 528361bytes
2022-11-30 18:03:37 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 4048362bytes
2022-11-30 18:03:46 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 8044984bytes
2022-11-30 18:04:25 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 1289043bytes
2022-11-30 18:04:30 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 111606bytes
2022-11-30 18:04:34 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 515742bytes
2022-11-30 18:04:49 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 7197481bytes
2022-11-30 18:05:25 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 762249bytes
2022-11-30 18:05:34 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 440504bytes
2022-11-30 18:05:35 +0000 [warn]: #0 chunk size limit exceeds for an emitted event stream: 184records
2022-11-30 18:05:37 +0000 [warn]: #0 chunk bytes limit exceeds for an emitted event stream: 3392831bytes

Additional context

No response

github-actions[bot] commented 1 year ago

This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove the stale label or add a comment, or this issue will be closed in 30 days.

github-actions[bot] commented 1 year ago

This issue was automatically closed after being stale for 30 days.