hotosm / tech

Resources and issue tracking for Technical Working Group and all things Tech within HOT. Start here to get more information about how to get involved in HOT.

Upgrading HTTPS security on HOT websites #127

Open nukeador opened 3 years ago

nukeador commented 3 years ago

Hi,

I've noticed that some HOT sites score really low on the Mozilla Observatory. This means they might be a target for different attack vectors and techniques (XSS, man-in-the-middle...)

https://observatory.mozilla.org/analyze/tasks.hotosm.org
https://observatory.mozilla.org/analyze/www.hotosm.org

It would be good to also run this over other sites under HOT's control.

Cheers.

nukeador commented 3 years ago

Talking with @willemarcel it seems you faced some challenges implementing this, so I wanted to share some tips from my personal experience implementing the recommendations on other sites:

  1. Try to implement all recommendations EXCEPT CSP. This will boost the security of the site to B, which is very important to avoid non-secure connections to the site or its resources, as well as XSS and clickjacking attacks.

  2. During a second phase, play with the different CSP options, starting with the more permissive one and moving to the most restrictive one in different phases:

https://infosec.mozilla.org/guidelines/web_security#content-security-policy

For this there are a few important considerations: keep a list of the external URLs you load resources from (or try to self-host most libraries to improve privacy), and analyze how much inline JS and CSS your code uses (so you can plan to move these into files instead).
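To make the phased approach concrete, a rollout could start with a permissive policy and tighten it over time. The directives below are illustrative examples, not HOT's actual policy:

```
# Phase 1: permissive — only require that every resource load over HTTPS
Content-Security-Policy: default-src https:

# Later phase: restrictive — self plus an explicit allow list
Content-Security-Policy: default-src 'self'; script-src 'self' https://cdn.example.com; img-src 'self' data:
```

Running each candidate policy in `Content-Security-Policy-Report-Only` mode first lets you see what would break before enforcing it.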

Please, ping me if you have other questions where my experience can be helpful.

Cheers.

/cc @eternaltyro @dakotabenjamin

eternaltyro commented 3 years ago

@nukeador Thank you for the guidelines. There are multiple challenges in play here.

Try to implement all recommendations EXCEPT CSP. This will boost the security of the site to B, which is very important to avoid non-secure connections to the site or its resources or XSS and click-jacking attacks.

The HOT website is hosted on GitHub Pages which, sadly, does not allow setting HTTP headers. So we are restricted to HTML <meta> tags for setting these security flags. Meta tags, as you may know, are not supported by all browsers and are not considered standard.
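For reference, CSP delivered via a meta tag looks like this (the policy value is just an example). Note that per the CSP spec, `frame-ancestors`, `report-uri` and `sandbox` are ignored when the policy comes from a meta element, and the tag must appear in `<head>`:

```html
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self'; img-src 'self' https:">
```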

During a second phase, play with the different CSP options, starting with the more permissive one and moving to the most restrictive one in different phases:

Missing any domains in the CSP whitelist causes drastic problems with functionality. Therefore, I had to know every single resource included in the HOT website. I used a browser tool that generates CSP policies and loaded (what I thought was) all the pages so that the tool could capture a list of the resources being loaded. I overestimated the coverage here, and the implementation broke the website and several blog posts.

I'm now trying to find tools (or build a simple one myself) that read the sitemap and visit all the pages it lists to build a list of resources.

Alternatively, I could use the Content-Security-Policy-Report-Only header and use a tool that parses the reports sent to the report-uri. Unfortunately, the report-only policy can't be delivered through <meta> tags.

In the end, using the sitemap.xml to crawl all pages and gather resources seems like the most reliable way to go. I'll keep this ticket updated as I make progress.
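A minimal sketch of the sitemap-driven approach, using only the Python standard library. The function names are hypothetical, and actual fetching of each page (e.g. with `urllib.request`) is left to the caller; the two helpers here just parse a sitemap document and scrape `src`/`href` attributes from a page's HTML:

```python
import re
import xml.etree.ElementTree as ET
from urllib.parse import urljoin

# The sitemap protocol's XML namespace, needed to look up <loc> elements.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return the list of page URLs declared in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def extract_resource_urls(html, base_url):
    """Collect src/href attribute values from a page, resolved against the page URL.

    A regex is a rough heuristic here; a real crawler would use an HTML parser.
    """
    urls = set()
    for _attr, value in re.findall(r'(src|href)=["\']([^"\']+)["\']', html):
        urls.add(urljoin(base_url, value))
    return urls
```

The union of `extract_resource_urls` across every page from `parse_sitemap` gives the domain list to seed the CSP allow list with.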

nukeador commented 3 years ago

The HOT website is hosted on GitHub Pages which, sadly, does not allow setting HTTP headers. So we are restricted to HTML <meta> tags for setting these security flags. Meta tags, as you may know, are not supported by all browsers and are not considered standard.

Do you have specific browsers in mind? According to this table, all major browsers have full compatibility with the main CSP directives that the Observatory recommends changing:

https://developer.mozilla.org/en-US/docs/Web/HTTP/CSP#browser_compatibility

My recent experience implementing CSP through meta tags on GitHub Pages didn't result in any compatibility issues.

Missing any domains in the CSP whitelist causes drastic problems with functionality. Therefore, I had to know every single resource included in the HOT website. I used a browser tool that generates CSP policies and loaded (what I thought was) all the pages so that the tool can capture a list of resources being loaded. I overestimated the coverage here and the implementation broke the website and several blogposts. In the end, it seems like using the sitemap.xml to crawl all pages and gather resources seems like the most reliable way to go. I'll keep this ticket updated as I make progress.

This can be quite complex, yes, especially if you want to account for every existing external resource; that's why doing CSP in a second phase is advised.

I would say that the security recommendation is to have control over external resources being loaded, so allowing anyone to add images from any domain might be a policy worth reconsidering. What other sites have done is to fetch and copy all these images locally, to avoid the burden of maintaining a huge allow list.

This also applies to external CSS and JS, where most of the time the most productive approach is to copy them locally so you have full control over the code that is executed.
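The self-hosting step above could be sketched roughly like this, using only the Python standard library. The function names, the `vendor` directory, and the example URLs are all illustrative, not part of HOT's actual setup:

```python
import os
import urllib.request
from urllib.parse import urlparse

def local_path_for(url, dest_dir="vendor"):
    """Map an external asset URL to the local path it will be served from."""
    filename = os.path.basename(urlparse(url).path) or "index"
    return "/" + dest_dir + "/" + filename

def mirror_assets(urls, dest_dir="vendor"):
    """Download each external asset into dest_dir; return a URL -> local path map.

    After mirroring, templates can reference the local paths and the CSP
    allow list can shrink to 'self'.
    """
    os.makedirs(dest_dir, exist_ok=True)
    mapping = {}
    for url in urls:
        local = local_path_for(url, dest_dir)
        urllib.request.urlretrieve(url, "." + local)  # fetch the remote file
        mapping[url] = local
    return mapping
```

Assets that change upstream would then need a re-run of the mirroring step, which is the usual trade-off of self-hosting versus a CDN allow list.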