fluent / fluent-plugin-kafka

Kafka input and output plugin for Fluentd

Output plugin kafka2 error: Kafka::BufferOverflow error="Cannot produce to XXX, max buffer size (1000 messages) reached" #471

Closed: outer199 closed this issue 1 year ago

outer199 commented 1 year ago

Describe the bug

Output plugin kafka2 error: Kafka::BufferOverflow error="Cannot produce to XXX, max buffer size (1000 messages) reached"

To Reproduce

...

Expected behavior

No error; chunks should flush to Kafka without a buffer overflow.

Your Environment

- Fluentd version: 1.14.3
- TD Agent version: 4.3.0
- fluent-plugin-kafka version: 0.18.0
- ruby-kafka version: 1.4.0
- Operating system: Alibaba Cloud Linux
- Kernel version: 4.18.0-240.22.1.el8_3.x86_64

Your Configuration

<match click>
  @type copy
  <store>
    @type kafka2
    <buffer tag>
      @type file
      path /var/log/td-agent/kafka-buffer/click
      flush_interval 5s
      max_buffer_size 10000
    </buffer>
    <format>
      @type json
    </format>
    topic click

    brokers xxxxxx
  </store>
</match>

Your Error Log

2022-10-16 11:15:22 +0800 [warn]: #0 failed to flush the buffer. retry_times=0 next_retry_time=2022-10-16 11:15:24 +0800 chunk="5eb1e4469ffc3fd1cc229451f88a3d58" error_class=Kafka::BufferOverflow error="Cannot produce to click, max buffer size (1000 messages) reached"
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/ruby-kafka-1.4.0/lib/kafka/producer.rb:525:in `buffer_overflow'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/ruby-kafka-1.4.0/lib/kafka/producer.rb:210:in `produce'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluent-plugin-kafka-0.18.0/lib/fluent/plugin/out_kafka2.rb:363:in `block in write'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluentd-1.14.3/lib/fluent/event.rb:315:in `each'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluentd-1.14.3/lib/fluent/event.rb:315:in `block in each'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluentd-1.14.3/lib/fluent/plugin/buffer/file_chunk.rb:171:in `open'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluentd-1.14.3/lib/fluent/event.rb:314:in `each'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluent-plugin-kafka-0.18.0/lib/fluent/plugin/out_kafka2.rb:324:in `write'
  2022-10-16 11:15:22 +0800 [warn]: #0 /opt/td-agent/lib/ruby/gems/2.7.0/gems/fluentd-1.14.3/lib/fluent/plugin/output.rb:1179:in `try_flush'

Additional context

No response
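Editor's note: the producer.rb frames in the trace above point at ruby-kafka's own in-memory producer buffer, which is separate from Fluentd's <buffer> section and capped at max_buffer_size (1000 messages by default in ruby-kafka). That would also explain why the max_buffer_size 10000 line inside <buffer> has no effect here: it does not appear to be a Fluentd buffer parameter, and it never reaches the producer. A minimal sketch of the mechanism, assuming ruby-kafka 1.4.0 and using placeholder broker and topic names (this is an illustration, not the plugin's actual code path):

require "kafka"

kafka = Kafka.new(["broker:9092"])                # placeholder broker
producer = kafka.producer(max_buffer_size: 1000)  # 1000 is ruby-kafka's default

# produce() only appends to the producer's local buffer; nothing is sent
# to Kafka until deliver_messages is called, so the cap is a pure
# message count per delivery cycle.
1000.times { |i| producer.produce("message #{i}", topic: "click") }
producer.produce("one more", topic: "click")
# => Kafka::BufferOverflow: Cannot produce to click, max buffer size (1000 messages) reached
# producer.deliver_messages would drain the buffer and reset the count.

If that reading is right, any Fluentd chunk carrying more than 1000 records overflows the producer during a single flush no matter how the <buffer> section is tuned, which matches the workaround in the comments below.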

raytung commented 1 year ago

Hey @outer199, thanks for reporting this. Do you mind upgrading the fluent-plugin-kafka plugin to 0.18.1 and seeing if that fixes the problem?

outer199 commented 1 year ago

> Hey @outer199, thanks for reporting this. Do you mind upgrading the fluent-plugin-kafka plugin to 0.18.1 and seeing if that fixes the problem?

Thanks for your reply. I set chunk_limit_records 980 and it seems to work now.
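For anyone landing here later, a minimal sketch of that workaround applied to the configuration above (the broker address is still a placeholder from the original report): chunk_limit_records is a standard Fluentd <buffer> parameter that caps how many records go into one chunk, so every flush stays below ruby-kafka's default 1000-message producer buffer.

<match click>
  @type copy
  <store>
    @type kafka2
    <buffer tag>
      @type file
      path /var/log/td-agent/kafka-buffer/click
      flush_interval 5s
      # keep each chunk below ruby-kafka's 1000-message producer buffer
      chunk_limit_records 980
    </buffer>
    <format>
      @type json
    </format>
    topic click
    brokers xxxxxx
  </store>
</match>

The max_buffer_size 10000 line is dropped here, since it is not a <buffer> parameter and seemed to have no effect in the original report.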

raytung commented 1 year ago

@outer199 Am I correct in understanding that this is no longer a problem? If so, feel free to close this issue.