Closed digigek closed 10 years ago
I'm not sure why it isn't working; as you said, it appears to be implemented correctly. I guess we need a test in the underlying jruby-kafka library to confirm its functionality.
Pull requests are always welcome :)
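For illustration, a test along these lines could pin down that logstash-style option names (underscores) are translated into the property names the Kafka producer expects (dots). The `translate_option` and `build_producer_properties` helpers below are hypothetical and not part of jruby-kafka; they only sketch the mapping such a test would verify.

```ruby
# Hypothetical sketch -- not jruby-kafka's real API. It illustrates the
# underscore-to-dot key mapping between logstash plugin options and
# Kafka producer properties that a unit test could confirm.
def translate_option(key)
  key.to_s.tr('_', '.')
end

def build_producer_properties(options)
  options.each_with_object({}) do |(key, value), props|
    props[translate_option(key)] = value.to_s
  end
end

props = build_producer_properties(
  message_send_max_retries: 1000,
  retry_backoff_ms: 500,
  request_required_acks: 1
)

puts props['message.send.max.retries']  # => "1000"
```

A test asserting on the resulting properties hash would have caught the option being silently dropped before it reached the producer.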
This should be fixed now.
I've tested this with the latest version of jruby-kafka by setting the max retries to 10 instead of 3, and got the following:
Java::KafkaCommon::FailedToSendMessageException: Failed to send messages after 10 tries.
Hi,
I have the following configuration for a kafka output in my logstash. Some time ago, I complained that the message.send.max.retries property was missing for the Kafka producer in the plugin. After it was introduced, I tried to use it, but I am still hitting the default of 3 retries.
kafka {
  broker_list => "broker:9092,broker:9093"
  topic_id => "test"
  request_required_acks => 1
  message_send_max_retries => 1000
  retry_backoff_ms => 500
}
If I shut down my Kafka cluster completely and start the above output, I get the following message in the log.
{:timestamp=>"2014-07-16T09:58:58.191000+0200", :message=>"kafka producer threw exception, restarting", :exception=>#<KafkaError: Got FailedToSendMessageException: Failed to send messages after 3 tries.>, :level=>:warn}
So it seems that my configured value of 1000 for message_send_max_retries is not taken into account, nor is the value of retry_backoff_ms. Can you please help me with this? I checked the code as well, and the property seems to be passed into the producer options. Could it be a problem with the underlying jruby-kafka library?
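For reference, the semantics being configured can be sketched as a simple retry loop: the producer attempts a send up to message.send.max.retries times, sleeping retry.backoff.ms between attempts, and finally raises the FailedToSendMessageException seen in the log above. This is an illustrative sketch of the behavior, not the actual jruby-kafka or Kafka producer code.

```ruby
# Illustrative sketch of message.send.max.retries / retry.backoff.ms
# semantics; not the real Kafka producer implementation.
class FailedToSendMessageError < StandardError; end

def send_with_retries(max_retries:, backoff_ms:, &send_attempt)
  attempts = 0
  begin
    attempts += 1
    send_attempt.call
  rescue => e
    if attempts < max_retries
      sleep(backoff_ms / 1000.0)  # back off before the next attempt
      retry
    end
    raise FailedToSendMessageError,
          "Failed to send messages after #{attempts} tries."
  end
end

# With every broker down, all retries are exhausted:
begin
  send_with_retries(max_retries: 3, backoff_ms: 1) { raise "broker unreachable" }
rescue FailedToSendMessageError => e
  puts e.message  # "Failed to send messages after 3 tries."
end
```

The "after 3 tries" in the logged exception therefore indicates the producer is still running with the default retry count, i.e. the configured option never reached the underlying producer.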