cameronkerrnz opened this issue 6 years ago
Check #17
I have the same problem, the codec doesn't work in the output:
org.jruby.exceptions.NoMethodError: (NoMethodError) undefined method `to_java_bytes' for #<#
Just ran into this now as well.
@cgiraldo, I see your PR has been open for a long time now. Anything we can do to get it merged in and released?
Thanks for the patch @cgiraldo , I built and installed this and my system is running perfectly again.
Same problem here. I use logstash-7.3.2.
@cgiraldo @sleighzy is there a way to build / install a fix for this issue?
Fixed in 1.2.0
There isn't a 1.2.0!
There is not a 1.2.0 release on GitHub, but there is one on RubyGems:
https://rubygems.org/gems/logstash-codec-avro_schema_registry/versions/1.2.0
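For anyone who just needs the fix, installing the published gem into an existing Logstash should be enough; something like the following from the Logstash home directory should pull it in (adjust the path to your installation):

    bin/logstash-plugin install --version 1.2.0 logstash-codec-avro_schema_registry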
Thanks @cgiraldo. We'd like to make changes to support Apicurio, so where's the 1.2.0 source code?
I think it is the master branch, @ryananguiano?
I have the latest 1.2.0 release. I am encountering the same error in Logstash when I have binary_encoded => true:
[2021-09-13T14:34:33,680][ERROR][logstash.javapipeline ][main] Pipeline worker error, the pipeline will be stopped {:pipeline_id=>"main", :error=>"(NoMethodError) undefined method `to_java_bytes' for #<#<Class:0x38aa6237>:0x32adac90>", :exception=>Java::OrgJrubyExceptions::NoMethodError
@cgiraldo @ryananguiano Regarding the previous comment by @choykalun: he built his gem from the latest master branch and did not download it from rubygems. Version 1.2.0 from rubygems was built from the latest master, right?
Hello, I'm doing a bit of development with this codec (thank you for making this available, much appreciated). I've struck an issue that may be due to a change in Logstash.
With Logstash 6.3 (I haven't tried other versions), I get the following stack trace:
This appears to occur because of the following code in avro_schema_registry.rb
/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-codec-avro_schema_registry-1.1.0/lib/logstash/codecs/avro_schema_registry.rb
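The tail of the encode method there is roughly this (quoting from memory, so treat it as a paraphrase of the 1.1.0 source rather than a verbatim copy):

    def encode(event)
      dw = Avro::IO::DatumWriter.new(@schema)
      buffer = StringIO.new
      # (schema lookup and magic byte / schema id handling elided)
      encoder = Avro::IO::BinaryEncoder.new(buffer)
      dw.write(event.to_hash, encoder)
      # The serialized payload is handed to the output's on_event hook,
      # already converted from a Ruby String into a Java byte[]:
      @on_event.call(event, buffer.string.to_java_bytes)
    end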
This points to a to_java_bytes call, and that seems to be what causes issues with kafka.rb (/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-kafka-7.0.10/lib/logstash/outputs/kafka.rb).
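If I'm reading kafka.rb correctly, it wires up the codec roughly like this (again a paraphrase; the exact code may differ between plugin versions):

    # in LogStash::Outputs::Kafka#register
    if value_serializer == 'org.apache.kafka.common.serialization.StringSerializer'
      @codec.on_event do |event, data|
        write_to_kafka(event, data)
      end
    elsif value_serializer == 'org.apache.kafka.common.serialization.ByteArraySerializer'
      @codec.on_event do |event, data|
        # the output does its own String -> Java byte[] conversion here
        write_to_kafka(event, data.to_java_bytes)
      end
    end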
So data here is presumably already the result of a to_java_bytes conversion, which is why JRuby complains that it doesn't have a to_java_bytes method. Changing avro_schema_registry.rb to pass in buffer.string instead of buffer.string.to_java_bytes seems to fix the problem, and I get the results I am expecting.
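In other words, the change I'm currently running with is just this one line (it relies on the output doing the byte[] conversion itself):

    # before
    @on_event.call(event, buffer.string.to_java_bytes)
    # after: pass the plain Ruby string and let kafka.rb convert it
    @on_event.call(event, buffer.string)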
Disclaimer: I am doing some custom development on your plugin to support my own use case (it's not in any public repo at this time, too early). In my use case, I have an Avro schema that I use to encapsulate the event in a Logstash reception tier. I have various fields that are used in a backbone format, and the event goes into a 'message' field (which could be JSON, some binary payload, etc.). That said, the same behaviour occurs when I use avro_schema_registry.rb as per your current HEAD.
Possibly related to https://github.com/logstash-plugins/logstash-output-kafka/issues/123
Here is my configuration and testing, after fixing the issue.
My logstash output configuration:
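(Roughly the following; the broker, topic, registry endpoint and subject here are placeholders rather than my real values, so treat this as a sketch of the shape of it.)

    output {
      kafka {
        bootstrap_servers => "localhost:9092"
        topic_id => "backbone-events"
        # ByteArraySerializer because the codec emits binary Avro
        value_serializer => "org.apache.kafka.common.serialization.ByteArraySerializer"
        codec => avro_schema_registry {
          endpoint => "http://localhost:8081"
          subject_name => "backbone-events-value"
        }
      }
    }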
Current version of the schema (I will change 'message' to bytes soon).
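It is roughly the following shape (the record name is a placeholder and the types are reconstructed from the consumer output below, so again a sketch rather than the exact schema):

    {
      "type": "record",
      "name": "BackboneEvent",
      "fields": [
        {"name": "submission_time",  "type": "long"},
        {"name": "submitted_from",   "type": "string"},
        {"name": "originating_host", "type": "string"},
        {"name": "vertical",         "type": "string"},
        {"name": "environment",      "type": "string"},
        {"name": "processing_key",   "type": "string"},
        {"name": "message_format",   "type": "string"},
        {"name": "message",          "type": "string"}
      ]
    }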
Test input:
[vagrant@node-1 ~]$ echo "{\"breath_in\": \"$(date --iso-8601=ns)\"}" | nc 127.0.0.1 5140
Output when viewed using kafka-avro-console-consumer:
{"submission_time":0,"submitted_from":"SUBMITTED_FROM","originating_host":"ORIGINATING_HOST","vertical":"VERTICAL","environment":"ENVIRONMENT","processing_key":"PROCESSING_KEY","message_format":"json","message":"{\"host\":\"node-1\",\"@timestamp\":\"2018-06-18T01:27:59.221Z\",\"breath_in\":\"2018-06-18T13:27:59,081679157+1200\",\"@version\":\"1\",\"port\":48710}"}