f5devcentral / f5-waf-elk-dashboards

Apache License 2.0
51 stars 47 forks

App-Protect Beta, ELK not recognising double-quotes escape #2

Closed ysazman closed 4 years ago

ysazman commented 4 years ago
    Product: NGINX App Protect

App-Protect Beta, ELK not recognizing double-quotes escape

464d41 commented 4 years ago

Please provide a request sample that causes this issue.

LY19191919 commented 4 years ago

A sample request is:

curl -i 'http://a.juicebox12345.com:8080?id=81 OR 1=1'

It should trigger syslog messages that look like the snippet below (notice the use of "" to escape a literal "):

policy_name="/Common/policy1",protocol="HTTP",request="GET /?id=81 OR 1=1 HTTP/1.1\r\nUser-Agent: curl/7.29.0\r\nHost: a.juicebox12345.com:8080\r\nAccept: /\r\n",request_status="blocked",response_code="0",severity="Critical",sig_cves="N/A",sig_ids="200002147,200002835,200002476",sig_names="SQL-INJ expressions like ""or 1=1"" (3),SQL-INJ expressions like ""OR 1=1"" (7) (Parameter),SQL-INJ expressions like ""or 1=1"" (6) (Parameter)",sig_set_names="{Generic Detection Signatures;SQL Injection Signatures},{Generic Detection Signatures;SQL Injection Signatures},{Generic Detection Signatures;SQL Injection Signatures}"

On ELK, the sig_names field is broken into 3 fields as a result:

sig_names "SQL-INJ expressions like ""or 1=1"" (3)

?SQL-INJ expressions like ""OR 1 1"" (7) (Parameter)

?SQL-INJ expressions like ""or 1 1"" (6) (Parameter)"
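To illustrate why the parse breaks, here is a minimal Python sketch (not the actual logstash code, and using a shortened, hypothetical message): a naive pattern that treats the first " as end-of-value truncates sig_names at the first "" escape, while normalizing "" to \" first lets an escape-aware pattern keep the value whole.

```python
import re

# Hypothetical, shortened App-Protect log fragment with CSV-style "" escapes.
msg = ('sig_ids="200002147,200002835",'
       'sig_names="SQL-INJ expressions like ""or 1=1"" (3),'
       'SQL-INJ like ""OR 1=1"" (7)"')

# Naive parse: the first " inside the value ends the field early,
# so sig_names is truncated (the behavior seen in ELK).
naive = dict(re.findall(r'(\w+)="(.*?)"', msg))

# Normalize "" -> \" first, then parse with a backslash-aware pattern:
# the full sig_names value survives, commas and all.
fixed_msg = msg.replace('""', '\\"')
fixed = dict(re.findall(r'(\w+)="((?:[^"\\]|\\.)*)"', fixed_msg))
```

The naive result drops everything in sig_names after the first doubled quote; the escape-aware parse keeps the whole value intact.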

464d41 commented 4 years ago

What kind of log message format do you use? This setup currently supports only "splunk" format.

LY19191919 commented 4 years ago

App-Protect's default log format, which is CSV, I believe. So just to clarify: does this ELK setup currently support only the splunk format?

Thanks.

464d41 commented 4 years ago

That is correct. Currently, logstash, which receives logs from App-Protect and feeds them to Elasticsearch, expects them in "splunk" format.

sgdavidw commented 4 years ago

I was using the default format with the following configuration in log-default.json:

{
    "filter": {
        "request_type": "all"
    },
    "content": {
        "format": "default",
        "max_request_size": "any",
        "max_message_size": "5k"
    }
}

I added the following lines to the /logstash/conf.d/30-waf-logs-full-logstash.conf file:

input {
  syslog {
    port => 5144
  }
}
filter {
  mutate {
    gsub => ["message", "\"\"", "\""]
  }
  kv {
    field_split => ","
  }
}

It looks to me that replacing "" with \" fixed the " escape-character issue; sig_names can now be parsed and shown in Kibana correctly. But I am not sure if this change would break something else.
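The two filter steps above can be sketched in Python (an approximation of the mutate gsub plus kv behavior, on a hypothetical shortened log line, not the actual logstash internals):

```python
import re

# Hypothetical shortened App-Protect log line using CSV-style "" escapes.
line = 'request_status="blocked",sig_names="SQL-INJ like ""or 1=1"" (3)"'

# Step 1: equivalent of mutate { gsub => ["message", "\"\"", "\""] }
# (replace doubled quotes with a backslash-escaped quote).
line = line.replace('""', '\\"')

# Step 2: a kv-style split that honors backslash escapes, roughly
# what the kv filter sees once the escapes are normalized.
fields = dict(re.findall(r'(\w+)="((?:[^"\\]|\\.)*)"', line))
```

After step 1, the commas inside sig_names no longer fall between two apparent closing/opening quotes, so the value is no longer split apart.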

aknot242 commented 4 years ago

@464d41 the logstash config assumes a CSV output, not a Splunk format. See that the default log format for NAP is CSV.

aknot242 commented 4 years ago

NAP 1.3 (the latest release as of this date) may have addressed this issue when the quoting was changed. Please try to reproduce this with an attack like this: curl "yourhostname/v=|%20uname%20-a%20&%20users" --verbose
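For reference, the percent-encoded path in that curl command decodes as follows (a quick sketch using Python's standard urllib):

```python
from urllib.parse import unquote

# The percent-encoded attack path from the reproduction command above.
path = "/v=|%20uname%20-a%20&%20users"

# Decoding reveals the command-injection payload the WAF should flag:
# a pipe into `uname -a` plus a `users` call.
decoded = unquote(path)
print(decoded)  # /v=| uname -a & users
```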

464d41 commented 4 years ago

curl "yourhostname/v=|%20uname%20-a%20&%20users" --verbose

Seems to work. I don't see any parsing artifacts.

(Screenshot: Kibana output, 2020-08-20 10:53 AM)

aknot242 commented 4 years ago

PR #6 appears to have addressed this issue. Closing.