logstash-plugins / logstash-filter-date

Apache License 2.0

Exception thrown due to dates with microseconds #69

Closed excalq closed 8 years ago

excalq commented 8 years ago

I'm attempting to ingest logs into our ELK stack. These logs, coming from an in-house Ruby app, carry timestamps in this format: 2016-09-28T15:38:57.302000-0700. I've tried adding a custom date filter to handle ISO8601, ss.SSSSSS, ss.SSS???, and ss.SSS000, but they all fail with the same exception:

{
  :timestamp=>"2016-09-28T15:38:57.302000-0700",
  :message=>"Failed parsing date from field",
  :field=>"time",
  :value=>#<BigDecimal:534a09ec,'0.1312003838E1',10(12)>,
  :exception=>"could not coerce BigDecimal to class java.lang.String",
  :config_parsers=>"YYYY-MM-DD HH:mm:ss Z,YYYY-MM-DD'T'HH:mm:ss.SSS000Z,ISO8601",
  :config_locale=>"en",
  :level=>:warn
}

Dropping the microsecond portion of the timestamp would also be fine; millisecond precision is plenty. I'm using Logstash 2.4.0, and this entry is forwarded by Filebeat 1.2.3.
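Since millisecond precision is enough, one workaround (a sketch, assuming the field is literally named `time` and arrives as a string) is to trim the fraction to three digits with `mutate`/`gsub` before the date filter runs. Note also that in Joda-Time patterns `DD` means day-of-year; day-of-month is lowercase `dd`:

```
filter {
  mutate {
    # trim ".302000-0700" down to ".302-0700" (keep milliseconds only)
    gsub => ["time", "(\.\d{3})\d+", "\1"]
  }
  date {
    match => ["time", "yyyy-MM-dd'T'HH:mm:ss.SSSZ", "ISO8601"]
  }
}
```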

excalq commented 8 years ago

Interestingly, this microsecond format is the default of the Ruby 2.1 stdlib Logger: see https://github.com/ruby/ruby/blob/ruby_2_1/lib/logger.rb#L514
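This default is easy to reproduce; a minimal sketch showing that the stdlib Logger emits six-digit (`%6N`, microsecond) fractional seconds out of the box:

```ruby
require 'logger'
require 'stringio'

buf = StringIO.new
log = Logger.new(buf)
log.info('hello')

# The default datetime format is "%Y-%m-%dT%H:%M:%S.%6N", so the emitted
# line contains a six-digit fractional second, e.g. 2016-09-28T15:38:57.302000
puts buf.string
```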

excalq commented 8 years ago

Hmm, this was my fault. Some of our app's log entries have the form {"status":"finished","time":0.343048734}, where time is a runtime in seconds, not a timestamp. Sorry for the false alarm.
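Given that the same `time` key sometimes holds a numeric runtime, one way to cope (a sketch, assuming the field name) is to guard the date filter with a conditional so non-timestamp values fall through untouched:

```
filter {
  # only parse values that look like an ISO8601 timestamp;
  # numeric runtimes such as 0.343048734 skip the date filter entirely
  if [time] =~ /^\d{4}-\d{2}-\d{2}T/ {
    date {
      match => ["time", "ISO8601"]
    }
  }
}
```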