Closed: tdram closed this issue 8 years ago
As documented in the README, the logstash config needs two input fields: program and message. The input plugin that imports your log lines into logstash does not define the program field, and the message field has too much data.
Which logstash input plugin do you use? It looks like the file input plugin, but I'm not sure (never used it, but the tags seem to indicate it). Maybe the docs on alternative inputs can help: https://github.com/whyscream/postfix-grok-patterns/blob/master/ALTERNATIVE-INPUTS.md
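For reference, the approach in that doc boils down to a small grok pre-filter that derives the missing program field from the raw syslog line before the postfix filters run. A minimal sketch of the idea, not a verbatim copy of the doc (the filename 49-pre-filter.conf is just an illustrative choice; check ALTERNATIVE-INPUTS.md for the exact pattern):

```
# Sketch: runs before 50-filter-postfix.conf, so it belongs in a
# lower-numbered file, e.g. a hypothetical 49-pre-filter.conf.
filter {
  grok {
    # Split off the syslog header: the program name (e.g. "postfix/qmgr")
    # goes into [program], and only the payload stays in [message].
    match => { "message" => "%{SYSLOGTIMESTAMP} %{SYSLOGHOST} %{DATA:program}(?:\[%{POSINT}\])?: %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }
}
```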
Thanks for the reply :)
I use logstash forwarder to ship logs.
Quoting the logstash forwarder docs:
The filebeat project replaces logstash-forwarder. Please use that instead.
Did you try the instructions on filebeat in the alternative inputs doc?
Hi, I just tried using filebeat. When I use the alternative input method, this is the error I get.
{:timestamp=>"2016-06-24T03:40:16.709000-0700", :message=>"fetched an invalid config", :config=>"input {\n beats {\n port => 5044\n ssl => true\n ssl_certificate => \"/etc/pki/tls/certs/logstash-forwarder.crt\"\n ssl_key => \"/etc/pki/tls/private/logstash-forwarder.key\"\n }\n}\n\ngrok {\n match => { \"message\" => \"%{SYSLOGTIMESTAMP} %{SYSLOGHOST} %{DATA:program}(?:\\[%{POSINT}\\])?: %{GREEDYDATA}\" }\n}\n\nfilter {\n # grok log lines by program name (listed alpabetically)\n if [program] =~ /^postfix.*\\/anvil$/ {\n grok {\n patterns_dir => \"/etc/logstash/patterns.d\"\n match => [ \"message\", \"%{POSTFIX_ANVIL}\" ]\n tag_on_failure => [ \"_grok_postfix_anvil_nomatch\" ]\n add_tag => [ \"_grok_postfix_success\" ]\n }\n } else if [program] =~ /^postfix.*\\/bounce$/ {\n grok {\n patterns_dir => \"/etc/logstash/p
.......
oat\",\n \"postfix_postscreen_violation_time\", \"float\"\n ]\n }\n}\n\noutput {\n elasticsearch {\n hosts => [\"localhost:9200\"]\n sniffing => true\n manage_template => false\n index => \"%{[@metadata][beat]}-%{+YYYY.MM.dd}\"\n document_type => \"%{[@metadata][type]}\"\n }\n}\n\n", :reason=>"Expected one of #, input, filter, output at line 10, column 1 (byte 187) after ", :level=>:error}
Any help? I'm stuck at this point.
Thanks.
Got it working. This thread helped me: https://github.com/whyscream/postfix-grok-patterns/issues/79#issuecomment-167950407
So I didn't create a separate file like 49-pre-filter.conf. I just edited 50-filter-postfix.conf and, as suggested in that thread, I added the alternative input grok block before the first if statement, and it worked.
Thanks for your amazing work :)
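For anyone hitting the same "fetched an invalid config" error: the grok stanza in the pasted config sits at the top level of the file, between the input and filter sections, and Logstash only accepts input, filter, and output blocks there (hence "Expected one of #, input, filter, output at line 10"). Moving that grok inside the filter block, before the first if in 50-filter-postfix.conf, is exactly what the fix above does. A rough, abridged sketch of how the top of that file then looks, reconstructed from the config pasted earlier:

```
filter {
  # Alternative-inputs patch, placed before the first conditional in
  # 50-filter-postfix.conf so the [program] field exists when the
  # per-program checks below run.
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP} %{SYSLOGHOST} %{DATA:program}(?:\[%{POSINT}\])?: %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }

  # First of the existing per-program conditionals, unchanged.
  if [program] =~ /^postfix.*\/anvil$/ {
    grok {
      patterns_dir   => "/etc/logstash/patterns.d"
      match          => [ "message", "%{POSTFIX_ANVIL}" ]
      tag_on_failure => [ "_grok_postfix_anvil_nomatch" ]
      add_tag        => [ "_grok_postfix_success" ]
    }
  }
  # ... remaining conditionals unchanged ...
}
```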
Thanks for the report. When I have time, I'll verify the instructions, and update the readme accordingly (if necessary).
I'm trying to filter postfix logs using the grok patterns provided.
The thing is, it's not filtering the postfix logs properly. It is not splitting up the message field, which contains various fields like FROM, TO, HOST, STATUS, NRCPT, etc.
The output just comes as
"message" => "Jun 16 00:00:01 serverhost postfix/qmgr[2337]: 9B6G21E2221: from=user@domain.com, size=6273, nrcpt=1 (queue active)", "@version" => "1", "@timestamp" => "2016-06-23T13:25:21.958Z", "type" => "log", "file" => "maillog.1", "host" => "localhost.localdomain", "offset" => "541"
The message comes as it is without being filtered.
I'm using Logstash, and Elasticsearch and Kibana are all at the latest stable release.
Can anyone help?
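One way to narrow this down is to run the filter against a single line with a stdin input and a rubydebug output and see which fields and tags come out; with the sample line above, there is no [program] field, so none of the postfix conditionals ever fire. A minimal test sketch, assuming the patterns from this repo are installed under /etc/logstash/patterns.d and that a POSTFIX_QMGR pattern is defined there:

```
input {
  stdin { }
}

filter {
  # Pre-filter: without a [program] field, the per-program conditionals
  # in 50-filter-postfix.conf never match and the message stays unparsed.
  grok {
    match => { "message" => "%{SYSLOGTIMESTAMP} %{SYSLOGHOST} %{DATA:program}(?:\[%{POSINT}\])?: %{GREEDYDATA:message}" }
    overwrite => [ "message" ]
  }

  # Minimal stand-in for the qmgr branch of the postfix filter config.
  if [program] =~ /^postfix.*\/qmgr$/ {
    grok {
      patterns_dir => "/etc/logstash/patterns.d"
      match        => [ "message", "%{POSTFIX_QMGR}" ]
      add_tag      => [ "_grok_postfix_success" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
}
```

Pasting the qmgr line from the report into stdin should show whether the from/size/nrcpt fields are extracted; if they are, the patterns themselves work and the problem is in how the real input feeds events into Logstash.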