Closed wylde780 closed 5 years ago
Thanks! I’ll check it out later this week and get back to you.
On Wed, Oct 2, 2019 at 00:16 wylde780 notifications@github.com wrote:
While trying to start fresh I've encountered some grok errors. It appears to me, based on the tags, that the error occurs in 10-pf.conf. Below is an event without a match from 05-syslog.conf. I'm not sure how to debug this further.
{ "host" => "10.42.0.1", "@version" => "1", "type" => "syslog", "message" => "<134>Oct 1 22:13:51 filterlog: 5,,,1000000103,igb0.20,match,block,in,4,0x0,,113,10251,0,none,17,udp,132,83.248.107.164,161.184.221.253,8999,51412,112", "@timestamp" => 2019-10-02T04:13:51.412Z }
Below is an event when matching: { "@timestamp" => 2019-10-02T04:10:34.782Z, "host" => "10.42.0.1", "tags" => [ [0] "pf", [1] "Ready", [2] "_grokparsefailure" ], "@version" => "1", "event" => { "original" => "<134>Oct 1 22:10:34 filterlog: 7,,,1000000105,igb0.21,match,block,in,6,0x00,0x00000,1,UDP,17,438,fe80::7add:12ff:fe83:eba,ff02::c,60004,1900,438" }, "type" => "syslog" }
Thanks
— You are receiving this because you are subscribed to this thread. Reply to this email directly, view it on GitHub https://github.com/a3ilson/pfelk/issues/38?email_source=notifications&email_token=AEA2HR7RBLKD627TL2C5GP3QMQODPA5CNFSM4I4RRH7KYY3PNVWWK3TUL52HS4DFUVEXG43VMWVGG33NNVSW45C7NFSM4HPAWSFA, or mute the thread https://github.com/notifications/unsubscribe-auth/AEA2HR6LNK6LLJDE3R4ZQHDQMQODPANCNFSM4I4RRH7A .
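As an aside on "I'm not sure how to debug this further": one way to test a grok pattern in isolation (a sketch only; the file name is an assumption, not something from this thread) is a minimal pipeline that reads one raw line from stdin and dumps every parsed field:

```
# debug-grok.conf (hypothetical name); run with:
#   /usr/share/logstash/bin/logstash -f debug-grok.conf
input { stdin { } }   # paste a raw filterlog line into the terminal
filter {
  grok {
    # swap in whichever pattern from 05-syslog.conf / 10-pf.conf is under test
    match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
  }
}
output { stdout { codec => rubydebug } }  # shows captured fields, or a _grokparsefailure tag
```

Recent Kibana versions also ship a Grok Debugger under Dev Tools, which lets you iterate on a single pattern against a sample line without restarting Logstash.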
What does your logstash-plain.log say? I get tons of errors from the latest confs/grok.
[2019-10-02T12:49:49,044][WARN ][logstash.filters.grok ] Grok regexp threw exception {:exception=>"Invalid FieldReference: ids_network[transport]", :backtrace=>[
"org/logstash/ext/JrubyEventExtLibrary.java:85:in `get'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:365:in `handle'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:343:in `block in match_against_groks'",
"(eval):15:in `block in compile_captures_func'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/jls-grok-0.11.5/lib/grok-pure.rb:202:in `capture'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:343:in `block in match_against_groks'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:339:in `match_against_groks'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:329:in `match'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:293:in `block in filter'",
"org/jruby/RubyHash.java:1419:in `each'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-grok-4.1.1/lib/logstash/filters/grok.rb:292:in `filter'",
"/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:143:in `do_filter'",
"/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:162:in `block in multi_filter'",
"org/jruby/RubyArray.java:1792:in `each'",
"/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `multi_filter'",
"org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:115:in `multi_filter'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:239:in `block in start_workers'"], :class=>"RuntimeError"}
Not sure if it's related or not, but this is also what I get in Kibana Discover:
tags:pf, Ready, _dateparsefailure, Suricata, _grokparsefailure ids_desc:ET INFO Session Traversal Utilities for NAT (STUN Binding Request) @timestamp:Oct 2, 2019 @ 12:49:48.915 syslog_timestamp:Oct 2 12:49:48 syslog_hostname:OPNsense.localdomain ids_class:Attempted User Privilege Gain ids_pri:1 syslog_program:suricata ids_sig_id:2016149 type:syslog syslog_message:[1:2016149:2] ET INFO Session Traversal Utilities for NAT (STUN Binding Request) [Classification: Attempted User Privilege Gain] [Priority: 1] {UDP} 172.16.66.5:63511 -> 139.59.84.212:3478 ids_sig_rev:2 host:192.168.0.1 ids_gen_id:1 syslog_pid:37956 received_at:2019-10-02T10:49:48.915Z event.original:<173>Oct 2 12:49:48 OPNsense.localdomain suricata[37956]: [1:2016149:2] ET INFO Session Traversal Utilities for NAT (STUN Binding Request) [Classification:
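A note on the "Invalid FieldReference: ids_network[transport]" warning in the log above: recent Logstash versions validate field references strictly, and a grok capture named with single-bracket syntax such as ids_network[transport] is rejected; nested fields have to be written as [ids_network][transport]. A sketch of the two shapes (the surrounding pattern text is illustrative only, not the actual line from the pfelk confs):

```
filter {
  grok {
    # Rejected by strict field-reference validation:
    #   %{WORD:ids_network[transport]}
    # Accepted nested-field form:
    match => { "syslog_message" => "{%{WORD:[ids_network][transport]}}" }
  }
}
```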
Thanks for providing the additional details. The first email you sent with the grok failure was related to IPv6. I'll attempt a fix, but I'm limited, as I do not have/use IPv6. The second error appears to be from a Suricata log. Would you be able to provide the raw log for the Suricata grok error?
I'll take a look at both later today, or when time permits this week.
Thanks
I'm a total noob with this; otherwise I would gladly help fix everything, since I would like to provide my client with the latest OPNsense setup I've sold them, with ELK logs and dashboards. I'm tech savvy, but these are my first days with ELK, so I'm a tad confused and lost.
I can provide everything and anything; I just need a bit of explanation about where to find things. I can provide the Suricata log from OPNsense, or any other part of the ELK logs.
I updated 05-syslog.conf and 10-pf.conf. Please update and give it another try. I took your original messages and applied them to the current grok patterns, and everything appeared to work.
-Andrew
Hrmm, well, initial results are the same. I must be doing something wrong. I'll delete everything and start fresh again. 10-pf.conf does not appear to be matching even though 'pf' is found in the tags. The <134> is syslog_pri (I think); is that causing an issue with the match?
{ "host" => "10.42.0.1", "@timestamp" => 2019-10-03T01:36:19.536Z, "@version" => "1", "tags" => [ [0] "pf", [1] "_grokparsefailure" ], "type" => "syslog", "event" => { "original" => "<134>Oct 2 19:36:19 filterlog: 5,,,1000000103,igb0.20,match,block,in,4,0x0,,115,18951,0,none,17,udp,58,74.137.107.150,162.184.221.253,45012,51412,38" } }
05-syslog.conf:

filter {
  if [type] == "syslog" {
    # Adjust to match the IP address of pfSense or OPNsense
    if [host] =~ /10\.42\.0\.1/ {
      mutate {
        add_tag => ["pf", "Ready"]
      }
    }
    if "Ready" not in [tags] {
      mutate {
        add_tag => [ "syslog" ]
      }
    }
  }
}
filter {
  if [type] == "syslog" {
    mutate { remove_tag => "Ready" }
  }
}

Thanks
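As an aside, the filterlog payload after the program name is plain CSV, so a raw line can be sanity-checked entirely outside Logstash. A rough Python sketch (field positions follow the pfSense/OPNsense filterlog format for an IPv4 record; the field names here are my own, not the ones the confs produce):

```python
import re

def parse_filterlog(line):
    """Split a raw syslog filterlog line into priority, timestamp, and CSV fields."""
    m = re.match(r"<(\d+)>(\w{3} +\d+ [\d:]+) filterlog: (.*)", line)
    if m is None:
        return None
    pri, ts, payload = m.groups()
    f = payload.split(",")
    # IPv4 filterlog field order starts: rule-number, sub-rule, anchor,
    # tracker, interface, reason, action, direction, ip-version, ...
    return {
        "syslog_pri": int(pri),
        "timestamp": ts,
        "rule_number": f[0],
        "interface": f[4],
        "reason": f[5],
        "action": f[6],
        "direction": f[7],
        "ip_version": f[8],
    }

# The non-matching sample line from the event above:
event = parse_filterlog(
    "<134>Oct 2 19:36:19 filterlog: 5,,,1000000103,igb0.20,match,block,in,4,"
    "0x0,,115,18951,0,none,17,udp,58,74.137.107.150,162.184.221.253,45012,51412,38"
)
```

If the CSV splits cleanly here but grok still fails, the problem is in the pattern before the payload, not the payload itself.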
Are you using pfSense?
Change line #5 within your 10-pf.conf to the following:
match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
-Andrew
@a3ilson The problem is that the grok pattern in 10-pf.conf is not correct. It expects the syslog_hostname to be there, but that field doesn't exist in firewall logs, so it will never match.
{
"tags" => [
[0] "pf",
[1] "_grokparsefailure"
],
"@timestamp" => 2019-10-03T01:25:45.387Z,
"@version" => "1",
"host" => "192.168.0.210",
"event" => {
"original" => "<134>Oct 3 10:25:45 filterlog: 54,,,1569985359,vtnet1,match,block,in,4,0x0,,64,0,0,DF,17,udp,78,192.168.11.1,192.168.11.210,34800,137,58"
},
"type" => "syslog"
}
A quick fix would be to add a second filter that doesn't include the host name.
# 10-pf.conf
filter {
  if "pf" in [tags] {
    grok {
      match => {
        "message" => [
          "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}",
          "%{SYSLOGTIMESTAMP:syslog_timestamp} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
        ]
      }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    mutate {
      rename => { "[message]" => "[event][original]" }
    }
  }
}
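The need for the second, hostname-less alternative can also be demonstrated outside Logstash. A rough Python approximation (these regexes merely stand in for SYSLOGTIMESTAMP, SYSLOGHOST, and DATA; they are simplified assumptions, not grok itself):

```python
import re

# Simplified stand-ins for the two grok patterns above:
TS = r"\w{3} +\d+ [\d:]+"  # ~ SYSLOGTIMESTAMP
WITH_HOST = re.compile(TS + r" (\S+) (\S+?)(?:\[(\d+)\])?: (.*)")  # with ~SYSLOGHOST
NO_HOST = re.compile(TS + r" (\S+?)(?:\[(\d+)\])?: (.*)")          # hostname-less

# A filterlog line (syslog priority tag stripped) carries no hostname token...
fw = ("Oct 3 10:25:45 filterlog: 54,,,1569985359,vtnet1,match,block,in,4,"
      "0x0,,64,0,0,DF,17,udp,78,192.168.11.1,192.168.11.210,34800,137,58")
# ...whereas a Suricata line does.
ids = ("Oct 2 12:49:48 OPNsense.localdomain suricata[37956]: "
       "[1:2016149:2] ET INFO Session Traversal Utilities for NAT")

fw_match = NO_HOST.match(fw)      # matches; program captured as "filterlog"
ids_match = WITH_HOST.match(ids)  # matches; hostname and program captured
```

Under these simplified stand-ins, the filterlog sample matches only the hostname-less shape. Real grok's DATA is far more permissive than \S+?, which is why the with-hostname pattern is listed first in the match array: grok tries the alternatives in order and stops at the first match.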
Thank you both for your responses. Sjaak01, your fix appears to be working. Logs are being parsed once again.
updated 10-pf.conf