Closed: keeyonghan closed this issue 10 years ago
I think this is an oversight; see the attached issue.
Thanks for the quick reply! I don't get how to fix this problem. You are saying that client_id needs to be removed from the conf file?
Nah, it's a problem with the underlying jruby-kafka library, which recently had its producer refactored. I'll try to push the fix tonight.
ok. Thanks! Looking forward to it.
Once https://github.com/elasticsearch/logstash/pull/1739 is accepted, this will be fixed.
Try rebuilding with the latest logstash. This should be fixed.
I am getting a ClassNotFoundException when I run the logstash command like the following:
$ sudo bin/logstash agent -f logstash-kafka.conf
...
WARNING: Unknown configuration key: topic.id
log4j:WARN No appenders could be found for logger (kafka.utils.VerifiableProperties).
log4j:WARN Please initialize the log4j system properly.
Exception in thread ">output" java.lang.ClassNotFoundException:
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(java/lang/Class.java:190)
at kafka.utils.Utils$.createObject(Utils.scala:438)
at kafka.utils.Utils$.createObject(kafka/utils/Utils.scala:438)
at kafka.producer.Producer.<init>(Producer.scala:62)
at kafka.producer.Producer.<init>(kafka/producer/Producer.scala:62)
at kafka.javaapi.producer.Producer.<init>(Producer.scala:26)
at kafka.javaapi.producer.Producer.<init>(kafka/javaapi/producer/Producer.scala:26)
at java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:526)
at RUBY.connect(/mnt/bin/logstash-1.4.2/vendor/bundle/jruby/1.9/gems/jruby-kafka-0.1.2-java/lib/jruby-kafka/producer.rb:69)
at RUBY.register(/mnt/bin/logstash-1.4.2/lib/logstash/outputs/kafka.rb:59)
at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
at RUBY.outputworker(/mnt/bin/logstash-1.4.2/lib/logstash/pipeline.rb:220)
at RUBY.start_outputs(/mnt/bin/logstash-1.4.2/lib/logstash/pipeline.rb:152)
at java.lang.Thread.run(java/lang/Thread.java:745)
Any idea what went wrong? The content of logstash-kafka.conf is like this:
input {
  syslog {
    type => syslog
  }
}

output {
  stdout { }
  kafka {
    topic_id => "test.syslog"
    compression_codec => "snappy"
    request_required_acks => 1
    producer_type => "async"
    retry_backoff_ms => 1000
    client_id => "test.syslog"
  }
}
Thanks!
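A note on what the trace above is showing: the frames kafka.utils.Utils$.createObject and java.lang.Class.forName are the old Kafka 0.8 producer reflectively loading a helper class whose name it reads from the producer properties, and the empty message on the ClassNotFoundException suggests it was handed a blank class name. The sketch below is only a hypothetical, minimal illustration of that failure mode, assuming a blank class-name property is the trigger; the class name and property key in it are made up and are not taken from jruby-kafka or Kafka source.

import java.util.Properties;

// Hypothetical sketch (not jruby-kafka or Kafka source): demonstrates how a
// blank class-name property becomes a ClassNotFoundException with an empty
// message, the same shape as the exception in the trace above.
public class BlankClassNameSketch {

    // Roughly what a reflective factory such as kafka.utils.Utils$.createObject
    // does: resolve the class by name, then invoke a constructor.
    static Object createObject(String className) throws Exception {
        return Class.forName(className).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // "some.helper.class" is an invented key for illustration; picture a
        // refactor that stops populating it, leaving the value empty.
        String className = props.getProperty("some.helper.class", "");

        // Class.forName("") throws java.lang.ClassNotFoundException whose
        // message is the (empty) class name, so the log shows only
        // "ClassNotFoundException:" with nothing after the colon.
        createObject(className);
    }
}

Running this sketch prints a stack trace ending in java.lang.ClassNotFoundException with an empty message, mirroring the report above.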