ngkogkos opened this issue 4 years ago
Here is a bash script I wrote for this very purpose. Feel free to use it, improve it, etc.
#!/usr/bin/env bash
# Fetch https://$1 with response headers included in the output (-i) and
# extract hostnames from the Content-Security-Policy header; tolower()
# covers the capitalized header names HTTP/1.x servers send.
curl -s -i "https://$1" | awk 'tolower($0) ~ /^content-security-policy:/' | grep -Eo "[a-zA-Z0-9./?=_-]*" | sed -e '/\./!d' -e '/[^A-Za-z0-9._-]/d' -e 's/^\.//' | sort -u
Sample output:
root@x:~# ./csp.sh hackerone.com
cover-photos.hackerone-user-content.com
errors.hackerone.net
hackathon-photos.hackerone-user-content.com
hackerone-us-west-2-production-attachments.s3.us-west-2.amazonaws.com
profile-photos.hackerone-user-content.com
www.google-analytics.com
www.youtube-nocookie.com
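For anyone who wants to sanity-check the filter stages without hitting a live target, the same pipeline can be fed a fabricated header line (the CSP value below is made up, not a real site's policy):

```shell
# Run the script's filter chain over a made-up CSP header line
echo "content-security-policy: default-src 'self'; img-src 'self' data: cdn.example.com; script-src www.example.org *.static.example.net" \
  | grep -Eo "[a-zA-Z0-9./?=_-]*" \
  | sed -e '/\./!d' -e '/[^A-Za-z0-9._-]/d' -e 's/^\.//' \
  | sort -u
# Prints:
#   cdn.example.com
#   static.example.net
#   www.example.org
```

Note how `*.static.example.net` survives as `static.example.net`: the glob star falls outside the allowed character class, and the trailing `s/^\.//` strips the leading dot it leaves behind.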
This should have been closed as per #681 AFAICS, shouldn't it?
Hmm, after looking at the MR's changes, there is no mention of Content-Security-Policy there (and no luck with a repo-wide search either).
So does the MR enable scraping domain names from all response headers, or am I missing something here?
Hello,
The `-active` flag enables Amass to grab SSL certs and look for subdomains in them. I was thinking that this flag could also make Amass grab the HTTP response and analyze its headers for more subdomains. A great example in this case is the CSP header:
Of course, organisations could return all sorts of HTTP headers, including custom ones, so I guess a regex that attempts to find and validate subdomains in HTTP responses (based on the HTTP services found via `-ports`) could do the job. Please let me know what you think :).
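A rough shell sketch of that idea (the script name and the hostname regex are my own assumptions, not Amass code): read raw response headers on stdin and keep only hostnames that fall under the target domain passed as `$1`.

```shell
#!/usr/bin/env bash
# headers2subs.sh (hypothetical): read raw HTTP response headers on stdin
# and print candidate subdomains of the domain passed as $1. The regex is
# only a guess at such a filter, not Amass's actual implementation.
domain="$1"
grep -Eoi "([a-z0-9_-]+\.)+${domain//./\\.}" \
  | tr '[:upper:]' '[:lower:]' \
  | sort -u
```

It could then be fed from any HTTP service discovered via `-ports`, e.g. `curl -s -D - -o /dev/null "https://hackerone.com" | ./headers2subs.sh hackerone.com` (`-D -` dumps the received headers to stdout and `-o /dev/null` discards the body).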