mandy-chessell opened 1 year ago

Hello, I am the project lead of the Egeria project. Our badge application is https://bestpractices.coreinfrastructure.org/en/projects/3044
We are struggling with the following requirement. It is showing that our GitHub repository is failing because the X-Content-Type-Options do not include nosniff. However, when we check it on securityheaders.io, it shows that it is correctly set up.
Our site seems to have the same settings as other websites showing as gold. What are we doing wrong? Any guidance would be welcome.
Similar issues have been raised in the past but not for some time:
Hmm. I thought we had fixed that.
It's not immediately obvious why that's happening. We're going to have to track down what's going on. Sorry about that!!
Can confirm this is still the case with Lychee: https://securityheaders.com/?q=https%3A%2F%2Fgithub.com%2FLycheeOrg%2FLychee&followRedirects=on
@andrewfader - when you get a chance, can you try to see why this is happening?
Looks like the problem is the project website, not the repo: https://securityheaders.com/?q=https%3A%2F%2Fegeria-project.org%2F&followRedirects=on. But it's displaying the wrong URL in the message.
Hmm, we want to make our error messages clear so they can be acted on. I think we should change:
// X-Content-Type-Options was not set to "nosniff"
into something like this:
// X-Content-Type-Options was not set to "nosniff" in least one of its relevant websites
But the problem still persists: both addresses I provided (GitHub release page + securityheaders) actually have X-Content-Type-Options set to "nosniff". https://bestpractices.coreinfrastructure.org/en/projects/2855
We use GitHub: https://github.com/LycheeOrg/Lychee and https://securityheaders.com/?q=https%3A%2F%2Fgithub.com%2FLycheeOrg%2FLychee&followRedirects=on verifies that we are compliant. Yet we still get: // X-Content-Type-Options was not set to "nosniff".
It would also be helpful to know which address does not have the header set.
https://securityheaders.com/?q=https%3A%2F%2Flycheeorg.github.io%2F&followRedirects=on shows red for X-Content-Type-Options
I guess no github.io website will be marked as compliant then. :|
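For anyone who wants to double-check a specific address themselves, here is a minimal sketch of such a header check, assuming Node 18+ (or any runtime with the global fetch API); the URLs are just the ones discussed above:

```ts
// Sketch: print the X-Content-Type-Options value a URL actually sends.
// Assumes a runtime with the WHATWG fetch API (e.g., Node 18+).
async function checkNosniff(url: string): Promise<void> {
  const response = await fetch(url, { redirect: "follow" });
  const value = response.headers.get("x-content-type-options");
  console.log(`${url}: X-Content-Type-Options = ${value ?? "(not set)"}`);
}

checkNosniff("https://lycheeorg.github.io/");
checkNosniff("https://github.com/LycheeOrg/Lychee");
```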
That's weird, GitHub used to be compliant!
I'm going to contact some friends at GitHub (via OpenSSF) to see what's going on. I suspect GitHub has changed some internal systems and accidentally dropped some security headers. Please stay tuned.
Now that we know what the problem is, my preference would be to convince GitHub to make a change so that the problem is fixed for everyone :-). Obviously I don't control GitHub, but hopefully they'll take a look. I've already contacted GitHub; let's see what we can do to resolve the problem for everyone.
Splitting sub-issue #1911 to address the error message
I talked to someone at GitHub, who agreed it made sense but didn't control the issue. That person did pass it on.
I can't promise anything right now, but we can cross fingers :-).
Here are a few added notes.
I know GitHub knows the advantages of secure headers in HTTP. They even developed an OSS library specifically to help enable secure headers: https://github.com/github/secure_headers
I'm aware that GitHub Enterprise allows this: https://docs.github.com/en/enterprise-server@3.6/admin/configuration/configuring-your-enterprise/configuring-github-pages-for-your-enterprise
But there doesn't seem to be an easy configuration option in "normal" GitHub pages.
There are workarounds with Cloudflare, Netlify, and/or Heroku, e.g.:
https://scotthelme.co.uk/security-headers-cloudflare-worker/
https://github.com/antomor/security-headers-cloudflare-worker
https://antomor.com/blog/security-headers-on-static-websites/
https://www.rzegocki.pl/blog/custom-http-headers-with-github-pages/
https://stackoverflow.com/questions/14798589/github-pages-http-headers
So that's a possibility without leaving GitHub pages.
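To illustrate, the Cloudflare approach in the links above boils down to a small Worker, deployed on a route in front of the Pages site, that proxies each response and rewrites its headers. A minimal sketch (the single header below is just an example, not a complete policy):

```ts
// Minimal Cloudflare Worker sketch: fetch the origin response, copy it so
// its headers become mutable, and add the missing security header.
export default {
  async fetch(request: Request): Promise<Response> {
    const upstream = await fetch(request);
    const response = new Response(upstream.body, upstream);
    response.headers.set("X-Content-Type-Options", "nosniff");
    return response;
  },
};
```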
That said, I'd rather make the secure option the default. We'll see what happens. GitHub laid off some folks, so that is going to pause many things right now :-(.
I've talked with GitHub. They're aware of this request, acknowledge the need, and would love to build the feature. Which is great! Unfortunately it's not planned yet, & they don't have any date to share with us.
In the meantime I guess Pages users will need to continue using the workarounds. That's unfortunate, but I don't have better information.
One thing we could do is provide, in the details, a few notes about how to work around this. I hate to do that, because I'd rather it be the default, but that may be the best way to proceed for now. Thoughts?
IMHO, knowing that this issue will eventually be resolved is enough; it is pretty much just a waiting game.
Waiting is the easy thing, especially since this is a gold level criterion, not a passing level criterion. I'm sad that it's hard today. On the other hand, pressing to make it "secure by default" seems like the right place to go long-term.
Waiting is not ideal. One obvious question is, is this an important requirement? I did some research & thinking to see if nosniff is really important.
I did some research to see if 'nosniff' was the default on typical web browsers already (and thus isn't needed). Unfortunately, as best I can tell, it is not the default, presumably because too many sites fail to provide accurate file type information. So that's not a valid argument.
A better argument, however, involves what this is intended to prevent. The point of 'nosniff' is to prevent user-created content from being misinterpreted if the web server fails to provide the correct file type. So 'nosniff' can be vital on sites which serve user-created content (that is, the site serves untrusted data mixed into its other content). In such cases we really do want to require 'nosniff'.
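To make that concrete, here is a hypothetical sketch of the situation 'nosniff' protects against: a server handing out untrusted user uploads (the file name and port are made up for illustration):

```ts
// Hypothetical sketch: serve an untrusted user upload. With "nosniff", the
// browser must honor the declared Content-Type instead of sniffing the bytes
// and possibly treating user data as something executable, such as HTML.
import { createServer } from "node:http";
import { readFile } from "node:fs/promises";

createServer(async (_req, res) => {
  const upload = await readFile("user-upload.txt"); // untrusted user content
  res.setHeader("Content-Type", "text/plain");
  res.setHeader("X-Content-Type-Options", "nosniff");
  res.end(upload);
}).listen(8080);
```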
However... in the case of GitHub pages, or sites like kernel.org, those are typically serving static pages generated solely by trusted processes (e.g., Jekyll on GitHub) and using only data controlled and trusted by the project - there's no untrusted data mixed in. In that case, which seems pretty common, 'nosniff' doesn't help. I'll also bet that most of those systems mark the file type anyway - and thus 'nosniff' will never be useful. So while there are systems where 'nosniff' is vital, we can probably define situations where it's okay to not have 'nosniff'. We'd have to craft a text change for this exception & have it reviewed. What's most important is making it very clear what the exception is, and why this exception will not harm the security of users.
What do you think?
Another approach would be whitelisting some domains which are known to publish static pages by trusted processes.
@ildyria - excellent point! I hadn't thought of that. If we can be confident that (for example) github.io only serves static pages, then that would solve the case for many. I think that statement is correct; I'm not sure how to verify it, but I imagine it's possible.
By itself that approach won't help the many OSS projects that use github.io as a host but use DNS to serve their own domain. But a tweak could solve that too - if the domain itself wasn't specially accepted, we could see if that domain has a CNAME record that points to something we accept (e.g., *.github.io). See the documentation on custom DNS for GitHub pages.
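As a sketch of what that check might look like (a hypothetical helper using Node's built-in DNS resolver; a real implementation would also need to handle apex domains, which point at GitHub Pages via A records rather than CNAMEs):

```ts
// Hypothetical sketch: does this custom domain alias a GitHub Pages host?
import { resolveCname } from "node:dns/promises";

async function aliasesGitHubPages(domain: string): Promise<boolean> {
  try {
    const targets = await resolveCname(domain);
    return targets.some((t) => t.toLowerCase().endsWith(".github.io"));
  } catch {
    return false; // no CNAME record, or the lookup failed
  }
}
```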
We'll need to verify that the claims are true - suggestions welcome.
Still, this sounds like an excellent idea. Thoughts anyone?
I think that would make sense. We should also consider allowing content on https://*.readthedocs.io, which also only serves static content.
I'm not sure this would work, but here's an interesting alternative approach.
Add CSP To Github Pages points out that a <META HTTP-EQUIV=...> tag in an HTML head region can provide certain HTTP headers from within an HTML file. That might be a pain for people to implement, though (and, as far as I can tell, browsers only honor a few headers this way, such as Content-Security-Policy, not X-Content-Type-Options).
Howdy - what is the latest here?
So, any updates?
> Another approach would be whitelisting some domains which are known to publish static pages by trusted processes.

That was a great idea.