dynamike67 opened this issue 8 years ago
@dynamike67 can you give us a sample event to check this? Thanks
logstash-forwarder.conf:

```json
{
  "paths": [ "/var/log/br3_coffeemachine_8000/access-*.log" ],
  "fields": { "type": "accesslog" }
}
```
logstash pipeline:

```
input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

filter {
  if [type] == 'accesslog' {
    grok {
      match => { 'message' => "%{COMBINEDAPACHELOG} - %{NUMBER:responseTime} ms" }
    }
    kv {
      source => 'request'
      field_split => "&?"
    }
    urldecode {
      all_fields => true
    }
    mutate {
      split => {
        'search_text' => " "
        'tags' => "+"
      }
    }
    date {
      match => [ 'timestamp', "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}

output {
  elasticsearch { hosts => ["10.65.249.29:9200"] }
  stdout { codec => rubydebug }
}
```
logstash.stdout:

```
{
  "message" => "::ffff:10.65.162.198 - - [10/Nov/2015:07:37:45 +0000] \"GET /_lbhealth HTTP/1.0\" 200 16 \"-\" \"-\" - 0.556 ms",
  "@version" => "1",
  "@timestamp" => "2015-11-10T07:37:45.000Z",
  "type" => "accesslog",
  "file" => "/var/log/br3_coffeemachine_8000/access-20151110.log",
  "host" => "br3-ext-qs-2.mm.br.de",
  "offset" => "1969408",
  "clientip" => "::ffff:10.65.162.198",
  "ident" => "-",
  "auth" => "-",
  "timestamp" => "10/Nov/2015:07:37:45 +0000",
  "verb" => "GET",
  "request" => "/_lbhealth",
  "httpversion" => "1.0",
  "response" => "200",
  "bytes" => "16",
  "referrer" => "\"-\"",
  "agent" => "\"-\"",
  "responseTime" => "0.556"
}
```

original logfile:

```
::ffff:10.65.162.198 - - [10/Nov/2015:07:37:45 +0000] "GET /_lbhealth HTTP/1.0" 200 16 "-" "-" - 0.556 ms
```
Logstash works well with the syslog filter.
Kind Regards Michael
I changed my filter to:

```
filter {
  if [type] == 'accesslog' {
    grok {
      match => { 'message' => "%{COMBINEDAPACHELOG} - %{NUMBER:responseTime} ms" }
    }
    date {
      match => [ 'timestamp', "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}
```
Now it runs perfectly; Logstash no longer stops with an error. Maybe the urldecode filter causes the problem?
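If urldecode is indeed the culprit, a possible middle ground (an untested sketch, not a confirmed fix) would be to decode only the fields you actually need instead of using `all_fields => true`, so the filter does not walk the entire event hash while writing into it. The field names below (`search_text`, `tags`) are assumed from the kv/mutate output in the original config:

```
filter {
  if [type] == 'accesslog' {
    # Decode named fields one by one instead of all_fields => true.
    urldecode { field => "search_text" }
    urldecode { field => "tags" }
  }
}
```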
Kind Regards Michael
Same issue here.
OS: Ubuntu Server 14.04.3
Logstash version: 2.0.0
Java version:

```
java version "1.8.0_66"
Java(TM) SE Runtime Environment (build 1.8.0_66-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.66-b17, mixed mode)
```
/var/log/logstash/logstash.err:

```
RuntimeError: can't add a new key into hash during iteration
  []= at org/jruby/RubyHash.java:992
  filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-throttle-2.0.2/lib/logstash/filters/throttle.rb:186
  multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/filters/base.rb:152
  each at org/jruby/RubyArray.java:1613
  multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/filters/base.rb:149
  cond_func_38 at (eval):884
  each at org/jruby/RubyArray.java:1613
  cond_func_38 at (eval):881
  filter_func at (eval):333
  filterworker at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:219
  start_filters at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:154
```
But I have no more information about which event caused this issue. I will set the log level to debug and try again.
```
{:timestamp=>"2016-10-20T12:41:26.899000+0000", :message=>"Exception in inputworker", "exception"=>#<RuntimeError: can't add a new key into hash during iteration>, "backtrace"=>[
  "org/jruby/RubyHash.java:992:in `[]='",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow/util.rb:268:in `ttl'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow/util.rb:242:in `[]='",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow.rb:362:in `decode_ipfix'",
  "org/jruby/RubyKernel.java:1242:in `catch'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow.rb:334:in `decode_ipfix'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.3.1/lib/bindata/array.rb:208:in `each'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.3.1/lib/bindata/array.rb:208:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow.rb:333:in `decode_ipfix'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow.rb:163:in `decode'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.3.1/lib/bindata/array.rb:208:in `each'",
  "org/jruby/RubyArray.java:1613:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.3.1/lib/bindata/array.rb:208:in `each'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-2.1.1/lib/logstash/codecs/netflow.rb:162:in `decode'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-2.0.5/lib/logstash/inputs/udp.rb:96:in `inputworker'",
  "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-2.0.5/lib/logstash/inputs/udp.rb:73:in `udp_listener'"
], :level=>:error}
{:timestamp=>"2016-10-20T12:41:26.921000+0000", :message=>"Invalid netflow packet received (value '0' not as expected for obj.records[1].flowset_id)", :level=>:warn}
```
OS: CentOS 7.1
Logstash: 2.0.0
Java: OpenJDK 1.8.0_65
Filter:

```
filter {
  if [type] == 'accesslog' {
    grok {
      match => { 'message' => "%{COMBINEDAPACHELOG} - %{NUMBER:responseTime} ms" }
    }
  }
}
```
logstash.err:

```
RuntimeError: can't add a new key into hash during iteration
  []= at org/jruby/RubyHash.java:992
  find_or_create_target at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/util/accessors.rb:119
  each at org/jruby/RubyArray.java:1613
  inject at org/jruby/RubyEnumerable.java:852
  find_or_create_target at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/util/accessors.rb:119
  lookup_or_create at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/util/accessors.rb:98
  set at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/util/accessors.rb:63
  []= at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/event.rb:146
  filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-urldecode-2.0.2/lib/logstash/filters/urldecode.rb:36
  each at org/jruby/RubyHash.java:1342
  filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-urldecode-2.0.2/lib/logstash/filters/urldecode.rb:36
  multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/filters/base.rb:152
  each at org/jruby/RubyArray.java:1613
  multi_filter at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/filters/base.rb:149
  cond_func_5 at (eval):326
  each at org/jruby/RubyArray.java:1613
  cond_func_5 at (eval):321
  filter_func at (eval):239
  filterworker at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:219
  start_filters at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.0.0-java/lib/logstash/pipeline.rb:154
```
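The frames above show the urldecode filter writing a new event field (via `event.rb` `[]=` and the accessors) while still inside `Hash#each` (RubyHash.java:1342), which Ruby forbids. A minimal plain-Ruby sketch of that failure mode and the usual fix (snapshot the keys before mutating) -- this is illustrative code, not the Logstash source:

```ruby
# Adding a *new* key to a Hash while iterating it with #each raises
# RuntimeError in Ruby 1.9+ (MRI and JRuby alike).
def mutate_during_each(hash)
  # Writes a new key for every existing key while #each is still running.
  hash.each { |k, _| hash["decoded_#{k}"] = true }
  hash
end

def mutate_after_snapshot(hash)
  # Safe variant: #keys returns a new Array, so mutating the hash inside
  # this loop no longer happens during Hash iteration.
  hash.keys.each { |k| hash["decoded_#{k}"] = true }
  hash
end

event = { "request" => "/_lbhealth" }
begin
  mutate_during_each(event.dup)
rescue RuntimeError => e
  puts e.message  # => "can't add a new key into hash during iteration"
end
puts mutate_after_snapshot(event.dup).keys.inspect  # => ["request", "decoded_request"]
```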
Kind Regards Michael