bpaquet / node-logstash

Simple logstash implementation in nodejs: file log collection, sent with zeromq

Timestamp not captured with regex filter (date_format) #132

Closed jerome83136 closed 8 years ago

jerome83136 commented 8 years ago

Hello,

I use node-logstash to process my Apache access_logs, whose lines begin like this:

192.54.145.*** - - [24/Jun/2016:14:05:13 +0200] "GET ... [...]

I have read the documentation, but I can't get my timestamp processed correctly. My logs are injected into Elasticsearch but the timestamp is lost.

I'm using the regex filter to apply a pattern on my logs and to extract fields.

I have tried to specify date_format, but with no luck.

It works with the grok filter and this config:

 grok {
  extra_patterns_file => '/conf/logstash/patterns.grok'
  match => '\[%{SHORTTIME:timestamp}\]'
  date_format => ['dd/MMM/yyyy:HH:mm:ss ZZ']
 }

NB: SHORTTIME in /conf/logstash/patterns.grok contains this --> SHORTTIME (?:.*)

How can I get the equivalent of this grok config, but with the regex filter?

Thanks for your help.
Jérôme

bpaquet commented 8 years ago

 regex {
  regex => /\[([^\]]+)\]/
  fields => timestamp
  date_format => ['dd/MMM/yyyy:HH:mm:ss ZZ']
 }

I agree, it's not a very fun syntax :)
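
A quick way to sanity-check that capture group outside node-logstash is a couple of lines of plain Node.js (the log line below is invented, just in the same format as yours):

 // Plain Node.js sketch, not node-logstash code; the access_log line is made up.
 const line = '192.54.145.10 - - [24/Jun/2016:14:05:13 +0200] "GET /index.html HTTP/1.1" 200 1234';

 // Same capture group as in the regex filter config above.
 const match = line.match(/\[([^\]]+)\]/);
 console.log(match && match[1]); // => 24/Jun/2016:14:05:13 +0200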

jerome83136 commented 8 years ago

Hello,

I had to change the regex to match my timestamp; this config is working for me:

 regex {
  regex => /^.*\s\[(.*)\]\s\"/
  fields => timestamp
  date_format => ['dd/MMM/yyyy:HH:mm:ss ZZ']
 }
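
For reference, I double-checked it with a few lines of plain Node.js (invented log line again; this only illustrates what the regex and date_format have to do, not how node-logstash handles it internally):

 // Plain Node.js sketch with an invented access_log line, same format as mine.
 const line = '192.54.145.10 - - [24/Jun/2016:14:05:13 +0200] "GET /index.html HTTP/1.1" 200 1234';

 // The working regex: anchor on the whitespace before '[' and on '] "' after it.
 const raw = line.match(/^.*\s\[(.*)\]\s\"/)[1]; // '24/Jun/2016:14:05:13 +0200'

 // Roughly what a dd/MMM/yyyy:HH:mm:ss ZZ parse amounts to: split the pieces and
 // rebuild a string that Date.parse accepts, e.g. 'Jun 24 2016 14:05:13 +0200'.
 const p = raw.match(/^(\d{2})\/(\w{3})\/(\d{4}):(\d{2}:\d{2}:\d{2}) ([+-]\d{4})$/);
 const date = new Date(`${p[2]} ${p[1]} ${p[3]} ${p[4]} ${p[5]}`);
 console.log(date.toISOString()); // => 2016-06-24T12:05:13.000Z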

Thank you for your help.

Best regards,
Jérôme