jvelaz opened this issue 3 years ago
Hello, I keep getting errors when trying to connect to my Schema Registry. I think the codec does not handle the exception returned by the server correctly.
[logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(ArgumentError) wrong number of arguments (given 1, expected 2)", :exception=>Java::OrgJrubyExceptions::ArgumentError, :backtrace=>[
"org.jruby.RubyException.exception(org/jruby/RubyException.java:129)",
"C_3a_.datos.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.schema_registry_minus_0_dot_1_dot_1.lib.schema_registry.client.request(C:/datos/logstash/vendor/bundle/jruby/2.5.0/gems/schema_registry-0.1.1/lib/schema_registry/client.rb:127)",
"uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.net.http.start(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/net/http.rb:914)",
"uri_3a_classloader_3a_.META_minus_INF.jruby_dot_home.lib.ruby.stdlib.net.http.start(uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/net/http.rb:609)",
"RUBY.request(C:/datos/logstash/vendor/bundle/jruby/2.5.0/gems/schema_registry-0.1.1/lib/schema_registry/client.rb:101)",
"RUBY.version(C:/datos/logstash/vendor/bundle/jruby/2.5.0/gems/schema_registry-0.1.1/lib/schema_registry/subject.rb:30)",
"RUBY.get_write_schema_id(C:/datos/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-avro_schema_registry-1.2.0/lib/logstash/codecs/avro_schema_registry.rb:189)",
... ...
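If I read the backtrace right, the ArgumentError is raised inside the gem's own error handling (schema_registry/client.rb:127) rather than by my config. In plain Ruby, that exact message shows up when an exception class whose initializer takes two arguments is raised with only a message string. A purely hypothetical sketch of that pattern (the class and names below are illustrative, not the gem's actual code):

# Illustration only: an error class whose initializer requires two arguments.
class ResponseError < StandardError
  def initialize(status, message)   # expects 2 arguments
    @status = status
    super(message)
  end
end

def request
  # Simulating an unexpected reply from the server (e.g. a non-2xx response).
  raise ResponseError, "Invalid response from the schema registry"
  # => ArgumentError: wrong number of arguments (given 1, expected 2)
  #    because `raise Klass, msg` calls Klass.new(msg) with a single argument.
end

request

So the original server-side error (auth failure, bad URL, HTML error page, etc.) seems to get swallowed and replaced by this ArgumentError.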
In my case I'm trying to output to Kafka, serializing with Avro:
output {
  file {
    path => "C:/datos/logstash/out/logfile.json"
    codec => "json_lines"
  }
  kafka {
    acks => "1"
    client_id => "My client"
    bootstrap_servers => "my servers"
    topic_id => "my topic"
    compression_type => "gzip"
    retries => 0
    codec => avro_schema_registry {
      endpoint => "schema registry URI"
      subject_name => "the AVRO schema in SR"
      schema_version => "5"
    }
    value_serializer => "org.apache.kafka.common.serialization.ByteArraySerializer"
  }
}
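In the meantime, a way to see what the registry actually returns for that subject and version is to hit its REST API directly with the same Net::HTTP stdlib the gem goes through. A rough sketch, assuming the standard Confluent Schema Registry path /subjects/<subject>/versions/<version> (the endpoint, subject and version values here are placeholders for my real ones):

require 'net/http'
require 'uri'

# Placeholders -- substitute the real registry URL, subject name and version.
endpoint = 'http://schema-registry:8081'
subject  = 'my-subject'
version  = 5

uri = URI("#{endpoint}/subjects/#{subject}/versions/#{version}")
response = Net::HTTP.get_response(uri)

# A healthy registry answers 200 with a JSON body containing "id" and "schema";
# anything else (401, 404, an HTML error page, ...) is the response the codec
# has to handle, and is probably what triggers the error above.
puts response.code
puts response.body

Running this against my registry would at least show whether the problem is on the server side (auth, wrong subject/version) or in how the codec handles the error response.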