logstash-plugins / logstash-input-beats

Apache License 2.0

Event transform fails with "no implicit conversion of String into Hash" error #498

Open JesseDocken opened 3 months ago


Logstash information:

Please include the following information:

  1. Logstash version (e.g. bin/logstash --version) 8.14.1

  2. Logstash installation source (e.g. built from source, with a package manager: DEB/RPM, expanded from tar or zip archive, docker) Docker Hub image

  3. How is Logstash being run (e.g. as a service/service manager: systemd, upstart, etc. Via command line, docker/kubernetes) Docker

  4. How was the Logstash Plugin installed Prepackaged

JVM (e.g. java -version):

If the affected version of Logstash is 7.9 (or earlier), or if it is NOT using the bundled JDK or using the 'no-jdk' version in 7.10 (or higher), please provide the following information:

  1. JVM version (java -version)
  2. JVM installation source (e.g. from the Operating System's package manager, from source, etc).
  3. Value of the JAVA_HOME environment variable if set.

OS version (uname -a if on a Unix-like system): Linux x86_64 SMP

Description of the problem including expected versus actual behavior:

We're forwarding events from an application instance through a filebeat container to Logstash, which applies some mild transformation logic before forwarding them to Elasticsearch for proper indexing. The logstash.conf file sets the data stream namespace from the name of the container that produced the log event; if there is no container.name, we instead hardcode it to "unknown-source".

For some reason, when we route our application's filebeat to point to the new Logstash node, we get a regularly recurring error message as follows:

```
[2024-08-08T18:35:39,429][INFO ][org.logstash.beats.BeatsHandler][main][21f2e638fa2eab3590e5958bd310dee6be34e576f07271b2734b4fad55773e7d] [local: 172.18.0.3:5044, remote: 172.31.96.123:42310] Handling exception: org.jruby.exceptions.TypeError: (TypeError) no implicit conversion of String into Hash (caused by: org.jruby.exceptions.TypeError: (TypeError) no implicit conversion of String into Hash)
[2024-08-08T18:35:39,429][WARN ][io.netty.channel.DefaultChannelPipeline][main][21f2e638fa2eab3590e5958bd310dee6be34e576f07271b2734b4fad55773e7d] An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception.
org.jruby.exceptions.TypeError: (TypeError) no implicit conversion of String into Hash
        at org.jruby.RubyHash.merge!(org/jruby/RubyHash.java:2120) ~[jruby.jar:?]
        at org.jruby.RubyHash.merge(org/jruby/RubyHash.java:2152) ~[jruby.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.decoded_event_transform.transform(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/decoded_event_transform.rb:16) ~[?:?]
        at org.jruby.RubyHash.each(org/jruby/RubyHash.java:1610) ~[jruby.jar:?]
        at org.jruby.java.proxies.MapJavaProxy.each(org/jruby/java/proxies/MapJavaProxy.java:581) ~[jruby.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.decoded_event_transform.transform(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/decoded_event_transform.rb:12) ~[?:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.codec_callback_listener.process_event(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/codec_callback_listener.rb:22) ~[?:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.patch.accept(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/patch.rb:10) ~[?:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_codec_minus_plain_minus_3_dot_1_dot_0.lib.logstash.codecs.plain.decode(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-codec-plain-3.1.0/lib/logstash/codecs/plain.rb:54) ~[?:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.patch.accept(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/patch.rb:9) ~[?:?]
        at usr.share.logstash.vendor.jruby.lib.ruby.stdlib.delegate.method_missing(/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/delegate.rb:87) ~[?:?]
        at usr.share.logstash.vendor.bundle.jruby.$3_dot_1_dot_0.gems.logstash_minus_input_minus_beats_minus_6_dot_8_dot_3_minus_java.lib.logstash.inputs.beats.message_listener.onNewMessage(/usr/share/logstash/vendor/bundle/jruby/3.1.0/gems/logstash-input-beats-6.8.3-java/lib/logstash/inputs/beats/message_listener.rb:52) ~[?:?]
Caused by: org.jruby.exceptions.TypeError: (TypeError) no implicit conversion of String into Hash
        ... 12 more
Caused by: org.jruby.exceptions.TypeError: (TypeError) no implicit conversion of String into Hash
        ... 12 more
```
(repeats many, many more times)
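
The error itself is easy to reproduce in plain Ruby, for whatever that's worth: `Hash#merge` raises this exact `TypeError` when handed a String instead of a Hash. A minimal sketch (my own illustration, not the plugin's actual code) of what line 16 of `decoded_event_transform.rb` presumably hits when a decoded event field is a String where a Hash is expected:

```ruby
# Illustration only, not the plugin's actual code: Hash#merge raises
# "no implicit conversion of String into Hash" when its argument is a
# String, because String does not respond to #to_hash.
event_hash = { "message" => "hello" }

# A Hash argument merges fine.
event_hash.merge({ "@metadata" => { "beat" => "filebeat" } })

begin
  # A String argument raises the TypeError from the log above.
  event_hash.merge("not-a-hash")
rescue TypeError => e
  puts e.message # => no implicit conversion of String into Hash
end
```

So it looks like some field in the decoded payload is arriving as a String where the transform expects a Hash.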

Steps to reproduce:

Please include a minimal but complete recreation of the problem, including (e.g.) pipeline definition(s), settings, locale, etc. The easier you make it for us to reproduce, the more likely it is that somebody will take the time to look at it.

  1. Set up a simple three-node Docker Compose stack with filebeat, logstash, and Elasticsearch.
  2. Configure filebeat to use the following config file:

     ```yaml
     ---
     filebeat.inputs:

     output.logstash:
       hosts: [":5044"]

     http:
       enabled: true
       host: 0.0.0.0
       port: 5066
     ```

  3. Configure logstash to use the following pipeline file:

     ```
     input {
       beats { port => 5044 }
       http { port => 5045 }
     }

     filter {
       age {}
       if [@metadata][age] > 86400 {
         drop {}
       }
     }

     filter {
       if [container][name] {
         mutate {
           add_field => { "[data_stream][namespace]" => "%{[container][name]}" }
         }
       } else {
         mutate {
           add_field => { "[data_stream][namespace]" => "unknown-source" }
           add_field => { "_missing" => "container.name" }
         }
       }
     }

     filter {
       mutate {
         remove_field => [
           "agent", "[container][labels]", "log", "tags", "stream", "input",
           "@version", "[cloud][account][id]", "[cloud][machine][type]",
           "[cloud][provider]", "[cloud][service][name]", "ecs"
         ]
       }
     }

     output {
       elasticsearch {
         hosts => ""
         user => ""
         password => ""
         data_stream => "true"
         data_stream_type => "logs"
         data_stream_dataset => "app"
       }
     }
     ```


  4. Have the application begin writing logs. (It's unknown which log entries trigger this; any guidance on seeing which log event is being ingested at the time of the exception would be welcome.)
