logstash-plugins / logstash-filter-date


_dateparsefailure only on specific value 2019-03-10 02:00:00 #129

Closed nchanmala closed 5 years ago

nchanmala commented 5 years ago

Create a file with the following value:

"2019-03-10 02:00:00"

Logstash config:

input {
  beats {
    port => "5046"
  }
}

filter {
  csv {
    columns   => ["UsageEndDate"]
    separator => ","
  }

  date {
    match    => ["UsageEndDate", "yyyy-MM-dd HH:mm:ss"]
    timezone => "America/New_York"
    target   => "newEndDate"
  }
}

output {
  stdout {
    codec => rubydebug
  }
}

After kicking off Logstash, I keep getting _dateparsefailure:

{ "@timestamp" => 2019-04-08T18:08:49.064Z, "offset" => 23, "@version" => "1", "input_type" => "log", "beat" => { "name" => "kibana", "hostname" => "kibana", "version" => "5.6.3" }, "host" => "kibana", "UsageEndDate" => "2019-03-10 02:00:00", "source" => "/mypath/aws-nc2.test", "message" => "\"2019-03-10 02:00:00\"", "type" => "log", "fields" => { "index" => "testing_aws" }, "tags" => [ [0] "beats_input_codec_plain_applied", [1] "_dateparsefailure" ] }

I have no issue if the value is 2019-03-10 01:00:00 or any other hour; the failure only happens with 02:00:00.

Thanks,

Noah

wiibaa commented 5 years ago

Hello, this specific timestamp does not exist because of the daylight saving time switch, so the underlying library fails (http://joda-time.sourceforge.net/faq.html#illegalinstant) and a date parse failure is reported.

If this is something you found in your input log, I would suspect a DST or timezone-setting issue in the program that wrote the timestamp in the first place. HTH
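For illustration, here is a minimal, untested sketch of a standalone config (generator input instead of beats/csv, otherwise mirroring the field format and timezone above) that should reproduce the behavior:

input {
  generator {
    # one event per line; only the second value falls in the DST gap
    lines => ["2019-03-10 01:00:00", "2019-03-10 02:00:00"]
    count => 1
  }
}

filter {
  date {
    match    => ["message", "yyyy-MM-dd HH:mm:ss"]
    timezone => "America/New_York"
    target   => "newEndDate"
  }
}

output {
  stdout { codec => rubydebug }
}

The 01:00:00 event should parse normally, while the 02:00:00 event should come out tagged _dateparsefailure, since that wall-clock time falls inside the 02:00-03:00 spring-forward gap in America/New_York.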

nchanmala commented 5 years ago

Thanks wiibaa. I did some more digging and found that the source data was actually in UTC. I changed my timezone => "UTC" and the problem went away.
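For reference, the date filter now looks roughly like this (everything else unchanged):

date {
  match    => ["UsageEndDate", "yyyy-MM-dd HH:mm:ss"]
  timezone => "UTC"
  target   => "newEndDate"
}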

nchanmala commented 5 years ago

This issue is solved.