Closed: ghost closed this issue 8 years ago
Use HTTPS for what though?
For Github Pages powered sites I just route it through Cloudflare. Enable HTTPS with them and add a page rule to redirect to HTTPS.
While there is a manual installation procedure, the advertised method requires piping code from a non-HTTPS source directly into your shell, which, needless to say, plenty of people find displeasing, and I have to say I agree with them. :smile:
Totally.
So what I've done in the past is:
- `my-project.tld` pointing to `project.github.io` (Cloudflare is one of a few that does CNAME flattening at the root for you, making this work)
- A page rule redirecting `http://*my-project.tld/*` traffic to HTTPS

As an added bonus this also makes it possible to turn on IPv6 and DNSSEC for your domain. Unfortunately GitHub doesn't do IPv6 yet, so that fails a bit :stuck_out_tongue:.
@daenney I hate Cloudflare; it always prompts me with a captcha because I use Tor.
I hope that fisherman.sh doesn't use Cloudflare, as it might be problematic for others.
I hope you can explain why we need to use Cloudflare.
@daenney I hate Cloudflare; it always prompts me with a captcha because I use Tor.
That is entirely up to the people hosting their domains on Cloudflare to decide. Each site owner adjusts what kind of behaviour they consider potentially threatening, and depending on that you'll get a captcha, or not.
I hope you can explain why we need to use Cloudflare.
If you're hosting your site on GitHub Pages you have no control over the server side, so you can't set up SSL yourself unless you proxy through someone. Cloudflare is one of the easiest ways to do that without setting up any infrastructure or paying anything for it. It also supports CNAME flattening at the apex/root of your domain, so you keep all the benefits of GitHub's CDN, which you lose when your domain returns the two A records for GitHub Pages.
Oh, no wonder so many sites use Cloudflare.
Luckily I didn't use cloudflare on my site.
I was thinking https://letsencrypt.org, but I still need to read up on that a bit in order to figure out exactly what needs to be done if we go the indie way.
But how do you intend to use letsencrypt? You have no control over the server that hosts the site right now. GitHub doesn't allow you to install custom certificates. So you'll either need to move off GitHub Pages for the hosting to your own thing, at which point letsencrypt is a possibility, or proxy through something like Cloudflare to have them handle SSL for you.
I see... The script is hosted on GitHub and the DNS forward is done by my domain provider. :confused:
@bucaran I think if you can host it yourself, you can just do it. https://www.youtube.com/watch?v=ZXsQAXx_ao0
I think you can use HTTP/2; it can be encrypted too and is faster than HTTPS.
If you want to learn to distribute software securely, I would recommend starting by reading this:
https://www.debian.org/doc/manuals/securing-debian-howto/ch7#s-deb-pack-sign
I'm not saying you need an infrastructure like Debian's for your project, but if you study how Debian does it, you can apply the same principles to your own system. Basically, sign your releases with GPG.
Remember: HTTP, HTTPS, SSL, TLS, HTTP/2 (which is brand new and doesn't replace anything yet!) are transport methods. They do not authenticate what is downloaded. They are like the postal service, and what you need is a signature and a notary.
Anyway, I tried.
@alphapapa Thanks for linking to that page. I have a couple of inquiries.
As you know, Fisherman is made only of shell scripts. There are no binaries or code that is compiled. Does that matter or is it irrelevant?
Basically, sign your releases with GPG.
So, does that mean I sign the `.git` directory every time I push a new version? After reading the scheme for package signature checks in `apt` on the page you provided, I am still having a hard time figuring this out. With `apt` packages, first there is `apt`. In the case of Fisherman, there is no `apt`, as it is not distributed via a package manager, but by installing a script that clones this repo using Git and then calls `make`.
I'm not saying you need an infrastructure like Debian's for your project, but if you study how Debian does it, you can apply the same principles to your own system. Basically, sign your releases with GPG.
GPG is the biggest user-experience clusterfuck to come out of the CS field since it was pioneered. Yes, it is good at what it does. No, it is not in any way easy to get started with, or a pleasant experience for novices. If installing software requires you to interact with GPG in any way, you're doing it wrong. There's specific tooling in Debian like `apt-key` and `add-apt-repository` precisely to avoid dealing with GPG entirely.
Also note that while Debian does "require" you to sign the resulting package, that does very little to verify the authenticity of the original source code; that burden is placed on the maintainer. It's perfectly possible for malicious software to make it into the Debian archive signed by a key that's part of the Debian keyring, though it won't be possible to tamper with the package once it makes it into the archive.
Something like The Update Framework is very well suited for distributing software updates securely without the pain of GPG. It's currently used by a number of package managers, such as Hackage, and is being explored by NodeJS, RubyGems and PyPI.
However, Fisherman shouldn't really concern itself with how people choose to install it. If people deem a `curl | fish` to be good enough for them without verifying anything, that's on them. We could easily store checksums, fetch them over HTTPS, and verify a number of other things before we initiate installation, but at some point you're going to have to trust something/someone.
If we want to give people other options we should get packages into the Debian archive, EPEL, Homebrew etc (and disable the built-in update mechanism when installed from a package).
@bucaran Remember that you could always use `sha256` in combination with `gpg` to sign it; that might be more effective. Perhaps add a method to the shell script that verifies the script is correct.
As you know, Fisherman is made only of shell scripts. There are no binaries or code that is compiled. Does that matter or is it irrelevant?
I'm not sure how to answer your question succinctly. Theoretically it is irrelevant. In practice, attacking a compiled binary requires far more expertise.
But for your purposes, it should be irrelevant. What we're discussing here is a secure delivery method for your software. Once the software is on your users' systems, it's out of your control.
Basically, sign your releases with GPG.
So, does that mean I sign the `.git` directory every time I push a new version? After reading the scheme for package signature checks in `apt` on the page you provided, I am still having a hard time figuring this out.
No, it would mean that you would do something like this:
$ cd ~/src
$ tar --exclude-vcs -czvf fisherman-20160215.tar.gz fisherman
$ gpg --armor --detach-sign fisherman-20160215.tar.gz
$ ls
fisherman
fisherman-20160215.tar.gz
fisherman-20160215.tar.gz.asc
Then you would upload the tarball and the `.asc` signature file as a release. Users would then download the tarball and, if they wanted to verify the signature, download the signature file, import your public key, and verify the signature. Then they would extract the tarball and install your software.
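That sign-and-verify flow can be shown end-to-end. The key below is a throwaway, passphrase-less demo key generated just so the sketch runs; a real release would be signed with the maintainer's long-lived key, and users would import its public half out-of-band.

```shell
#!/bin/sh
# Demo of signing a release tarball and verifying it.
# "Demo Maintainer" and the file names are placeholders.
set -eu
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"

# Generate an unprotected demo key (never do this for a real release key).
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Maintainer
Name-Email: demo@example.com
Expire-Date: 0
%commit
EOF

# Maintainer side: build the tarball and a detached, ASCII-armored signature.
mkdir -p fisherman && echo 'fisherman sources' > fisherman/README
tar --exclude-vcs -czf fisherman-20160215.tar.gz fisherman
gpg --batch --yes --armor --detach-sign fisherman-20160215.tar.gz

# User side: verify the tarball against the .asc signature
# (after importing the maintainer's public key).
gpg --verify fisherman-20160215.tar.gz.asc fisherman-20160215.tar.gz \
    && echo "signature OK"
```

Any change to the tarball after signing makes `gpg --verify` fail, which is the tamper-protection being discussed here.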
You can also sign your git tags and commits. See `man git-tag` for info.
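A minimal sketch of signed tags, again with a throwaway demo key and a scratch repository (the repo, identity, and tag name are all hypothetical):

```shell
#!/bin/sh
# Demo: sign a git tag with GPG and verify it.
set -eu
export GNUPGHOME=$(mktemp -d)
chmod 700 "$GNUPGHOME"

# Throwaway, unprotected demo key (a real maintainer key would be protected).
gpg --batch --quiet --gen-key <<'EOF'
%no-protection
Key-Type: RSA
Key-Length: 2048
Name-Real: Demo Maintainer
Name-Email: demo@example.com
Expire-Date: 0
%commit
EOF

git init -q demo && cd demo
git config user.name  "Demo Maintainer"
git config user.email "demo@example.com"
git config user.signingkey demo@example.com
git commit -q --allow-empty -m "initial commit"

git tag -s v1.0 -m "release v1.0"   # signs the tag with the configured key
git tag -v v1.0 && echo "tag signature OK"
```

A user who has imported the maintainer's public key can then run `git tag -v v1.0` after cloning, instead of dealing with tarballs at all.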
There's specific tooling in Debian like apt-key and add-apt-repository precisely to avoid dealing with GPG entirely.
If you're building Debian packages, sure. If not, using GPG to sign and verify release tarballs is very simple.
Also note that while Debian does "require" you to sign the resulting package, that does very little to verify the authenticity of the original source code; that burden is placed on the maintainer. It's perfectly possible for malicious software to make it into the Debian archive signed by a key that's part of the Debian keyring, though it won't be possible to tamper with the package once it makes it into the archive.
Yes, that's why Debian has key-signing parties, where people gather and sign each other's GPG keys in-person, and why Debian has a procedure for granting new members developer and maintainer rights. It all comes down to trust in the developers and maintainers, that they will do their job properly and vet any software they upload.
The Update Framework sounds great, and that's why I mentioned it in the discussion on #90 last week.
However, Fisherman shouldn't really concern itself with how people choose to install it. If people deem a `curl | fish` to be good enough for them without verifying anything, that's on them.
I disagree. Fisherman should neither support nor encourage dangerous installation methods like that. It should only endorse and support secure methods (not that any method is perfectly secure).
If we want to give people other options we should get packages into the Debian archive, EPEL, Homebrew etc (and disable the built-in update mechanism when installed from a package).
That would be a great long-term goal, yes.
But, of course, this isn't my project. I'm just an outsider trying to encourage people to be security-conscious and avoid spreading dangerous ideas.
Lots to learn from you, thank you @alphapapa.
I am using GitHub to host the script and the install URL resolves to that script's raw URL which is always on GitHub. Does that mean an attacker would need to hack GitHub or my DNS provider to harm Fisherman?
And,
Users would then download the tarball and, if they wanted to verify the signature, download the signature file, import your public key, and verify the signature. Then they would extract the tarball and install your software.
IMO this is too complicated for the average user. I would download a script that does this for me, but then we are back to the start or not? How can we make this easier for users?
The Update Framework sounds great...
TBH if they want more people to hop on this they need to make these concepts easier to understand.
Does that mean an attacker would need to hack GitHub or my DNS provider to harm Fisherman?
Yes. Though if they DNS-spoofed you they'd also need to present you with a valid certificate for github.com (as in, handed out by a trusted CA).
I disagree. Fisherman should neither support nor encourage dangerous installation methods like that. It should only endorse and support secure methods (not that any method is perfectly secure).
Then your proposed method of GPG isn't secure either because it's an opt-in. If this is to hold true it should not be possible to install Fisherman in any way, or use it, without having verified its authenticity. Which isn't solved by just signing a release with GPG.
If we're not to support any method of dangerous installation, then we should even move the whole project off GitHub, since a `git clone` doesn't guarantee authenticity either.
@daenney I thought `git clone` does that? It checks the files using SHA-1, if I recall correctly.
@bucaran
Lots to learn from you, thank you @alphapapa.
It's kind of you to say that, but again, I'm far from an expert. I'm just an enthusiastic hobbyist who's used Linux, Debian, etc. for a long time. And I try to stay relatively informed on security stuff in general.
I am using GitHub to host the script and the install URL resolves to that script's raw URL which is always on GitHub. Does that mean an attacker would need to hack GitHub or my DNS provider to harm Fisherman?
Basically, yes. Some attack vectors I can think of would be:

1. Compromising your GitHub account and pushing malicious commits.
2. Compromising GitHub's servers and altering what they serve for your repository.
3. Spoofing or hijacking DNS so the install URL resolves to a server the attacker controls.
4. Intercepting the download in transit (a man-in-the-middle attack).
For methods 1 and 2, signing your releases and git tags with GPG protects against this. An attacker would have to compromise your personal system and sign their own code with your private key.
For methods 3 and 4, HTTPS/TLS would help mitigate, however there are potential attack vectors there as well:
a. If a Certificate Authority or a CA key were compromised (and if you've been paying attention, it seems likely that this is a safe assumption), a certificate could be generated that would appear authentic to the browser.
b. Software vulnerabilities and bugs in server or client code could render HTTPS/TLS ineffective. For example, bugs have in the past made it possible to transparently disable encryption or downgrade to a more easily compromised encryption method. Or bugs could cause key disclosure (e.g. Heartbleed). Remember, all software contains bugs, and more bugs are always being discovered.
IMO this is too complicated for the average user. I would download a script that does this for me, but then we are back to the start or not? How can we make this easier for users?
Yes, you're right that doing that would bring us back to square one.
To be frank, most users are probably not going to go to the effort to verify the integrity of your software before installing it. Some people don't care; and by the same token, some people get infected with malware, viruses, worms, trojans, etc.
But just because some people won't be careful doesn't mean that you shouldn't make it possible, and encourage them to be. Of course, this is your decision. But when it's a simple matter of running a script that generates a tarball and then signs it with GPG, and all you have to do is run the script and type in your passphrase, why not do it?
Of course, this is only "phase one" of the process. After this, you still need to authenticate the plugins that fisherman downloads and installs, and that is going to be the more challenging part. But it's just as important, if not more so, especially since plugins are created by third parties.
TBH if they want more people to hop on this they need to make these concepts easier to understand.
I'm sure there's room for improvement, but to be honest, there's only so much that can be done. To design and implement a secure system requires understanding certain things, just like writing Fisherman requires knowing about Fish, shell scripting, networking, git, etc. And every software developer should have a basic understanding of these concepts.
@daenney
Yes. Though if they DNS-spoofed you they'd also need to present you with a valid certificate for github.com (as in, handed out by a trusted CA).
Basically, yes. However, as I mentioned, HTTPS and the CA infrastructure is not a silver bullet, and it should not be relied upon to secure your software distribution.
Then your proposed method of GPG isn't secure either because it's an opt-in. If this is to hold true it should not be possible to install Fisherman in any way, or use it, without having verified its authenticity. Which isn't solved by just signing a release with GPG.
Sorry, but this doesn't make sense. You can't stop people from running your code if they want to. We're talking about free software shell scripts here.
All security is ultimately opt-in. The point here is not to force people to opt-in. The point is to give them something to opt-in to, and to encourage them to do so.
If we're not to support any method of dangerous installation...
By "not support" I mean "not encourage" and "not help people use." In other words, if people want to install the software unsafely, they're on their own.
then we should even move the whole project off GitHub since a git clone doesn't guarantee the authenticity either.
As I've said, signing git tags with GPG protects against this. See `man git-tag` for info.
@pickfire
I thought git clone does that? It checks for the files using sha1 if I recall correctly.
`git clone` merely takes whatever the server gives it. It does not verify the authenticity of anything. It could detect accidental corruption if hashes don't match, but it can't detect malicious attacks that replace good commits with ones containing bad code.
Git itself uses SHA hashes at a low level to identify everything, because in this way git is like a "content-addressable filesystem." If you want to learn more about how git works internally, this is probably the canonical source, although there are many other good guides available: https://git-scm.com/book/en/v2/Getting-Started-Git-Basics
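Content-addressing is easy to see directly, and it also shows why it is not authentication: anyone, including an attacker, can produce a perfectly valid object id for any content, so matching hashes prove integrity of the bytes, not their origin. A small sketch (the repo name is arbitrary):

```shell
#!/bin/sh
# git addresses objects by a hash of their content: identical bytes
# always produce the identical object id, on any machine. This catches
# corruption, but a malicious server can serve different, validly
# hashed objects, so hashes alone are not authentication.
set -eu
git init -q hash-demo && cd hash-demo

# The same content always yields the same blob id.
id1=$(printf 'test content\n' | git hash-object -w --stdin)
id2=$(printf 'test content\n' | git hash-object --stdin)
echo "$id1"   # d670460b4b4aece5915caf5c68d12f560a9fe3e4
[ "$id1" = "$id2" ] && echo "same content, same id"
```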
@alphapapa I really like your article about security. You seem to have more security knowledge than most of us here; we really hope you could help secure Fisherman. What do you think?
And what if `fisherman.sh` is compromised? And are there any exceptions to what you just said?
@pickfire I'm not sure what article you mean. But as I mentioned in the other bug report, here is one that I think should be considered required reading for anyone trying to deal with this issue: http://www.cryptnet.net/fdp/crypto/strong_distro.html At least read Section 1 of the article, which gives an overview of the principles involved.
I'm glad to help if I can, but I think it's important to emphasize that this is far from a trivial problem. I understand the appeal of providing an all-in-one tool that downloads, installs, and updates shell script "plugins" independently of the underlying system, but doing this "right" practically requires implementing a secure software distribution infrastructure from scratch. And doing that is very difficult and error-prone even for actual experts!
Having said that, I think that a lot of improvements could be made by using signed git tags. This would provide at least theoretical out-of-band initial verification by retrieving PGP keys from a third-party (not perfect, but better than nothing), and subsequently should be fairly secure, as long as you use the same keys and the keys don't get compromised. I think this would probably be better than trying to design some kind of package- or tarball-based system.
But this still leaves the matter of unverified third-party plugins. Are you going to set up your own repository for them, and are you going to manually review, verify, and sign them all? That's a "human resources" problem, because it can't be automated.
It's just a very hard problem. Ultimately it can't all be automated. Probably the best you can do is secure your own software and give your users the tools to verify as much as they can.
And what if fisherman.sh is compromised? And are there any exceptions to what you just said?
Compromised where? On GitHub? Well, that's basically game-over, isn't it? That would mean that GitHub or your personal systems were compromised. The problem there is a scenario in which users install updates to Fisherman automatically, thereby automatically compromising their systems. Automatic updates are usually a bad idea, but this is especially so for unverified software.
On a user's system? That would be game-over for that user, because the attacker would then have full access to the system with that user's privileges. In that case, why would the attacker bother with your script? He could do whatever he wanted. That would be like someone breaking into a house through the front door--why would the intruder then mess with the window locks? He's already in.
Does anyone want to take a stab at this? I might someday, but I'd rather focus on tidying things up internally with Fisherman and plugins.
I found this discussion very interesting, and it raises some valid questions. Even more interesting is that package repositories like `npm` are not validated or verified. So basically you could just upload a malicious package with worm-like code that infects other packages along the way. Now that I write this, I remember reading about exactly that scenario not long ago.
Tag signing will only work when all "official" plugins are hosted centrally and are therefore signed with an official Fisherman key (or possibly several trusted individuals), something I suspect is not exactly what this project is about. From what I gather, this is what others do and could lead to rate-limiting via GitHub.
Having each plugin creator sign their tags individually wouldn't solve the problem as you can just create GPG keys at will and sign whatever you want. The lingering danger of someone suddenly updating their plugin with possibly malicious code remains. Also, this would require a release-based workflow as only certain tagged releases would be deemed safe. The review workload for trusted individuals would be significant as any change to any plugin would have to go through official code review, then merged and signed before being available to the public. Again, this needs a central repository.
I'd be up for thinking this through more thoroughly and putting research into safe code delivery practices.
If you're interested, I could compile a list of the most important questions regarding user experience and manageability to work onward from there.
If you're interested, I could compile a list of the most important questions regarding user experience and manageability to work onward from there.
Absolutely, I'd love to see what you come up with.
@herrbischoff Thanks for that thoughtful comment. One question:
Having each plugin creator sign their tags individually wouldn't solve the problem as you can just create GPG keys at will and sign whatever you want. The lingering danger of someone suddenly updating their plugin with possibly malicious code remains.
What do you mean? If the Fisherman project had the keys for each plugin author on-file, then it could verify the signature of new plugin versions' git tags. If a different key was used, it would fail to verify. The attack vector then would be to compromise the plugin author's GPG key, which would require stealing his secret key and discovering his passphrase.
Signing packages however still doesn't really solve the problem. It might prevent things from being tampered with but just because a package is signed does not make it non-malicious. Donald Stufft wrote a pretty good blog post about it a while ago for similar work on PyPi.
https://caremad.io/2013/07/packaging-signing-not-holy-grail/
Typically any time the topic of security and software packages, in my case typically Python packages, comes up someone seems to come up with the “helpful” suggestion of “Just Use X!”, where X is typically GPG but can be any of a wide range of signing technologies. Quite often the people suggesting it have latched onto signing packages as some sort of voodoo you can throw at the problem and magically get “security”.
I suggest people read the rest of the article.
Of course it's not a panacea, but it's approximately one million times better than not signing packages at all. :) Preventing tampering is very important (cf. Linux Mint's ISO servers being compromised recently). And short of manually reviewing every change to every plugin, it's the best that can be done. Obviously Fisherman isn't going to review every change to every plugin. But a web of trust is better than nothing.
I don't think the suggested Web of Trust solves anything though. The trust in GPG is normally built up by signing the keys of people you know. Yes, there are things like key-signing parties, where random people sign other random people's keys based on poorly verified, sometimes forged, documents. But that's hardly trust. Yet that's mostly how these WoTs work.
For me to be able to release a plugin, which @bucaran would want other users to be able to install, he'll have to sign my key. But what does that say? Does he actually trust me? Does he know me? Does he know I adhere to decent development practices and trust that I won't release a plugin that contributes to the end of humanity? Or is it just an administrative hurdle that new contributors have to go through?
And what if he decides that "oh it's Monday today and now I don't like this person anymore" and revokes the signature on my key? All of a sudden my plugins can't install/update anymore even though there's nothing wrong with them, except for potentially a personal dispute with the person that holds the key to the kingdom.
It's not as simple as "something is better than nothing" and adding GPG doesn't necessarily make things safe or trusted. It risks creating the impression of safety even though the foundation, that trust, isn't really there which is far more dangerous as it lulls people into a false sense of security.
What I meant was the scenario @daenney describes (going to read the full article after finishing this post). Signing does not prevent an author from injecting malicious code into their own plugin; it would make it a wee bit more involved though. Short of meeting in person, exchanging and signing each other's keys, and signing some binding, internationally valid agreement, you can never be sure of the other person's intentions. And even then, agreements get broken all the time because people change.
A Web of Trust is built on signing each other's keys only when trust has been established in the real world: through official documents, extensive personal relationships, what have you. It only works when you adhere to these principles, which for anyone except close friends and family is next to impossible. This is one of the reasons why we still don't have working email encryption for the masses.

There either needs to be sufficient awareness of the significance of signing (unlikely to remain, the bigger the group gets) or there needs to be a central authority. The central authority has comprehensive powers to allow and deny, which is a rather shaky idea in itself; there has to be absolute trust in it. On an institutional level this is usually safeguarded by a board or panel with an uneven number of participants to prevent standoffs, power grabs and unethical behavior. All participants must have proven their eligibility to serve in such a function by years of dedicated service to other causes, and must have something significant to lose should they ever break the professional oath they take after being sworn in.
I don't see how we can accomplish anything like that. Then again, you need to keep your sights level on what we are doing here. Repositories like `npm` or `PyPI` are fundamentally plagued by the same issues, yet still work reasonably well. That's because people usually want to do good, not evil. Those software repositories are used to create production software, while we are managing shell functions. You should always inspect before you execute. You cannot insure against carelessness; this is a basic responsibility of any user, and with great power comes great responsibility. In fact, many Unix flavors remind the user of exactly that the first time he logs in as root or uses `sudo`.
The most workable solution, on a meta level, appears to be to offer some kind of signing/verifying but make it optional. I'm not sure how this would square with the fish philosophy. Let me dwell and think on that a little more. In the meantime, I invite everyone to write down your input and thoughts and we can sort through them afterwards. I will also start to compile a list with the most important questions regarding the pillars of this construct.
Sorry if I wasn't clear before. What I mean is a system like this:

1. A plugin author submits a new plugin along with his public GPG key.
2. Fisherman verifies the initial submission and keeps the author's key on file.
3. Subsequent releases are signed git tags, which Fisherman verifies against the stored key before installing updates.
This is not technically a PGP-style web-of-trust, but it makes it much harder for malicious code to enter the system. The initial submission is verified by Fisherman, and subsequent releases can only be made with the plugin author's key. If a key is later used to sign a malicious version of a plugin, Fisherman can remove the key from its list, delete the plugin, and that key can no longer publish plugins.
It's similar to Debian's system, in which a new developer must have his identity verified and his key signed, and after that, anything signed with his key is trusted.
It's not technically difficult to implement, and it provides a significant security benefit.
@alphapapa If you want to join the organization and help us make this happen, just say the magic words.
Great discussion, but closing as I don't have the strength to work on this at the moment. Again, if anyone is feeling courageous, be my guest :smile:
https://letsencrypt.org/howitworks/