lexelby / logstash-filter-concatenate

like multiline, except easier to use and with flushing enabled

"RuntimeError: can't add a new key into hash during iteration" #2

Open timukas opened 10 years ago

timukas commented 10 years ago

Hello, with a pretty basic concatenate filter setup:

concatenate { key => "%{uniqt}" min_flush_time => 10 max_flush_time => 5 }

I'm quite often getting the following error:

Exception in filterworker {"exception"=>#<RuntimeError: can't add a new key into hash during iteration>, "backtrace"=>["org/jruby/RubyHash.java:986:in `[]='", "/home/logstash/beeeta/logstash/lib/logstash/filters/concatenate.rb:64:in `filter'", "(eval):264:in `initialize'", "org/jruby/RubyProc.java:271:in `call'", "/home/logstash/beeeta/logstash/lib/logstash/pipeline.rb:262:in `filter'", "/home/logstash/beeeta/logstash/lib/logstash/pipeline.rb:203:in `filterworker'", "/home/logstash/beeeta/logstash/lib/logstash/pipeline.rb:143:in `start_filters'"], :level=>:error}
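For context, this error is Ruby's standard guard against mutating a hash while it is being iterated. A minimal standalone reproduction (hypothetical code, not the plugin's actual source) looks like this:

```ruby
# Minimal sketch of the failure mode: adding a key to a hash while
# another piece of code is iterating over it raises a RuntimeError.
def mutate_during_iteration
  pending = { "key1" => ["line1"] }
  pending.each do |key, lines|
    # Simulates e.g. a flush running while the filter worker is adding
    # a new event key to the same shared hash.
    pending["key2"] = ["line2"]
  end
end

begin
  mutate_during_iteration
rescue RuntimeError => e
  puts e.message  # => can't add a new key into hash during iteration
end
```

In a multithreaded pipeline the iteration and the insertion happen on different threads, so the error appears intermittently rather than on every event.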

After the error, Logstash hangs and no longer processes messages. I have to kill it and start it again.

CPU and RAM are not an issue in my case: 16 cores at 3 GHz, 32 GB of memory, and fast SSD disks. The event volume is also modest, around 100-200 per second.

Logstash version: 1.4.0.beta2 (I tried other versions of Logstash as well; same error.)

$ jruby -v
jruby 1.7.11 (1.9.3p392)

$ java -version
java version "1.7.0"

Any ideas on that error? What could be wrong?

regards,

appplemac commented 10 years ago

I think the cause could be the absence of synchronisation between the worker and the flusher thread, see koendc/logstash@cacd0036b6a499a470773d43bd9d3714f8776d25 for more context.
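If that diagnosis is right, the usual fix is to guard every access to the shared hash with a mutex. A minimal sketch, assuming the plugin keeps pending events in a shared hash (the class and variable names here are hypothetical, chosen for illustration):

```ruby
require "thread"

# Sketch: serialize access between the filter worker (which adds lines)
# and the flusher thread (which drains the buffer) using a Mutex, so
# neither thread iterates the hash while the other mutates it.
class ConcatenateBuffer
  def initialize
    @pending = {}          # key => accumulated lines (shared state)
    @mutex = Mutex.new
  end

  # Called from the filter worker for each incoming event.
  def add(key, line)
    @mutex.synchronize { (@pending[key] ||= []) << line }
  end

  # Called periodically from the flusher thread. Swapping in a fresh
  # hash keeps the critical section short: iteration over the flushed
  # entries happens outside the lock, on a hash no one else touches.
  def flush
    @mutex.synchronize do
      flushed = @pending
      @pending = {}
      flushed
    end
  end
end
```

Usage: the worker calls `buffer.add(event_key, message)`, while the flusher calls `buffer.flush` on its timer and emits each drained entry as a concatenated event.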