shadow-maint / shadow

Upstream shadow tree

bye github #684

Closed alejandro-colomar closed 1 year ago

alejandro-colomar commented 1 year ago

Hi all,

While github has always had a nice interface (especially for git blame; but projects like cgit are catching up on this), I'm very concerned about the decisions it has taken lately, especially since one year ago around the npm issues (if somebody doesn't know them, I can try to find links to the discussions).

I personally prefer sending patches through a mailing list, but acknowledge that it's nice to have automated CI for incoming patch sets. Hopefully, soon we'll be able to run all tests locally in a way that lets me know that my patch sets are correct before sending.

Are there alternatives as good as github for this? I don't know much about gitlab. While I used it in the past in a job, I don't know if they host free runners for open-source projects, as github does.

This also reminds me that even if shadow is not migrated to gitlab or somewhere else, we should strongly consider not being locked in to github's CI, so the tests and build/CI system should be as generic as possible in that regard, in case we change our minds later.

I'd like to be able to continue contributing to the project, and at the same time I'd like to leave github :).

Cheers, Alex

hallyn commented 1 year ago

I have in the past lobbied for moving to sr.ht. It has a good build infrastructure, and patches through mailing lists.

The main concern has been that something like https://git.sr.ht/~hallyn/shadow is too tied to one person.

alejandro-colomar commented 1 year ago

I have in the past lobbied for moving to sr.ht. It has a good build infrastructure, and patches through mailing lists.

+1

The main concern has been that something like https://git.sr.ht/~hallyn/shadow is too tied to one person.

The Linux man-pages project lived for more than a year at http://www.alejandro-colomar.es/src/alx/linux/man-pages/man-pages.git until I got access to https://git.kernel.org. I think it was good as a reminder that free software is decentralized, and lives in the forks of the maintainers and contributors.

We could have several "official" repositories (listing them all as official in the README), one for each maintainer, and have one of them be the "current" main tree (to which the others sync their default branch); the others could take over at any moment if the main fork dies. Kind of a Docker Swarm of repos :).

FWIW, my fork of shadow lives here: http://www.alejandro-colomar.es/src/alx/shadow/shadow.git/

rbalint commented 1 year ago

Hi all,

While github has always had a nice interface (especially for git blame; but projects like cgit are catching up on this), I'm very concerned about the decisions it has taken lately, especially since one year ago around the npm issues (if somebody doesn't know them, I can try to find links to the discussions).

Please detail the concerns. I have not noticed anything disturbing from this project's POV lately.

I personally prefer sending patches through a mailing list, but acknowledge that it's nice to have automated CI for incoming patch sets. Hopefully, soon we'll be able to run all tests locally in a way that lets me know that my patch sets are correct before sending.

I hate receiving more complex patches via mailing lists, because it is not clear if they would pass CI and the conversations are not presented as nicely as on GitHub/GitLab/etc. issues.

For running GitHub Actions locally there exists https://github.com/nektos/act , which I have not tried but I assume it would work.
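Roughly (I haven't verified this against shadow's workflows, and the job name below is just a placeholder), running it looks like:

$ act -l          # list the jobs defined under .github/workflows
$ act             # run the workflows triggered by a push event
$ act -j build    # run a single job by name ("build" is a placeholder)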

Are there alternatives as good as github for this? I don't know much about gitlab. While I used it in the past in a job, I don't know if they host free runners for open-source projects, as github does.

IMO GitHub is the best in terms of functionality thanks to the Actions and Apps Marketplace. It is also the most widely known and used. GitLab also hosts open source projects, but I haven't seen a compelling reason to switch to it.

This also reminds me that even if shadow is not migrated to gitlab or somewhere else, we should strongly consider not being locked in to github's CI, so the tests and build/CI system should be as generic as possible in that regard, in case we change our minds later.

I don't think putting a lot of effort into not using GitHub's features is time well spent. On the contrary, taking advantage of what's available helps maintain the project with the least effort and with high quality.

I'd like to be able to continue contributing to the project, and at the same time I'd like to leave github :).

So far it seems to be a personal preference. Maybe there is a compelling reason for the migration, but I have not seen it presented so far. Many decisions taken by the shadow project are documented in GitHub issues and pull requests, and they practically can't be migrated. A potential migration would force remaining and new contributors to use two git hosting providers to dig into the history behind the code changes, which is significantly less pleasant than the status quo, which can be kept with the least effort.

It is still not clear why you would impose such an unpleasant change on the project contributors and the wider audience.

alejandro-colomar commented 1 year ago

Hi all, While github has always had a nice interface (especially for git blame; but projects like cgit are catching up on this), I'm very concerned about the decisions it has taken lately, especially since one year ago around the npm issues (if somebody doesn't know them, I can try to find links to the discussions).

Please detail the concerns. I have not noticed anything disturbing from this project's POV lately.

https://github.com/github/site-policy/issues/513

It doesn't affect this project. But I don't trust github and their policies.

Now they're enforcing 2FA. I think 2FA can be a good thing, and allowing it is good. But enforcing it? No, thanks.

What will they require in the future? It's MS behind that, so who knows.

The real issue that GH is trying to address with 2FA is that people trust GH as a source of truth for anything. If it's not a source of truth, then if somebody hacks a GH account, what can they do? Modify source? That's why you can issue GPG-signed tags: downstream can trust those commits. My own git repo doesn't have HTTPS for a reason: the only way I expect somebody to trust that something is really written by me is my GPG key. Anything else is not to be trusted.
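As a rough sketch of what I mean (the tag name is a placeholder, not an actual shadow release):

$ git tag -s v4.x -m "shadow v4.x"    # maintainer creates a GPG-signed tag
$ git verify-tag v4.x                 # downstream verifies the signature, not the hosting site
$ git show --show-signature v4.x      # inspect which key actually made the signature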

GH wants to position itself as the source of truth for free software. And that affects every project, as users will ask you to be on that site. So while there's nothing directly disturbing to this project, there's something to be concerned about.

And I certainly don't want to carry my phone with me just to be able to contribute to open source. Indeed, writing this reminds me of the Debian tests for software:

https://wiki.debian.org/DissidentTest

I believe GH's policy breaks the spirit of the dissident test, because it forces contributors to carry their phones with them, which can be used to track them.

I personally prefer sending patches through a mailing list, but acknowledge that it's nice to have automated CI for incoming patch sets. Hopefully, soon we'll be able to run all tests locally in a way that lets me know that my patch sets are correct before sending.

I hate receiving more complex patches via mailing lists, because it is not clear if they would pass CI and the conversations are not presented as nicely as on GitHub/GitLab/etc. issues.

That's why I want the CI to be able to run locally. It's then trivial to test them, without depending on the infrastructure where the code is hosted.

For running GitHub Actions locally there exists https://github.com/nektos/act , which I have not tried but I assume it would work.

I prefer something like:

... configure commands ...
$ make
$ make check

which I can run locally. Then the CI should just run those same commands.
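As a sketch of what I mean (the script name and the configure step are hypothetical, not something shadow has today), the CI job could be reduced to a single wrapper that contributors run locally too:

#!/bin/sh
# ci/run-checks.sh -- hypothetical wrapper; the hosted CI would call only this
set -e
./configure    # placeholder for whatever configure commands the project uses
make
make check

Then switching CI providers only means pointing the new provider at the same script.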

Are there alternatives as good as github for this? I don't know much about gitlab. While I used it in the past in a job, I don't know if they host free runners for open-source projects, as github does.

IMO GitHub is the best in terms of functionality thanks to the Actions and Apps Marketplace. It is also the most widely known and used. GitLab also hosts open source projects, but I haven't seen a compelling reason to switch to it.

This also reminds me that even if shadow is not migrated to gitlab or somewhere else, we should strongly consider not being locked in to github's CI, so the tests and build/CI system should be as generic as possible in that regard, in case we change our minds later.

I don't think putting a lot of effort into not using GitHub's features is time well spent. On the contrary, taking advantage of what's available helps maintain the project with the least effort and with high quality.

I'd like to be able to continue contributing to the project, and at the same time I'd like to leave github :).

So far it seems to be a personal preference.

Maybe there is a compelling reason for the migration, but I have not seen it presented so far. Many decisions taken by the shadow project are documented in GitHub issues and pull requests, and they practically can't be migrated. A potential migration would force remaining and new contributors to use two git hosting providers to dig into the history behind the code changes, which is significantly less pleasant than the status quo, which can be kept with the least effort.

I was taught to write commit messages that cover the important parts of such discussions, so that you don't depend on a mailing list or bug tracker. Only the less relevant details should be kept out of a commit message, and usually emailing one of the people mentioned in the commit message tags helps getting those details if necessary.

It is still not clear why you would impose such an unpleasant change on the project contributors and the wider audience.

I'm not a maintainer, but rather just a contributor (even though I authored about half of the commits in the last 1.5 years), so I leave the decision up to the maintainers. I'm just expressing my personal preference :).

jubalh commented 1 year ago

I believe GH's policy breaks the spirit of the dissident test, because it forces contributors to carry their phones with them, which can be used to track them.

Phones aren't the only way to use 2FA. You can use a Yubikey for example

alejandro-colomar commented 1 year ago

I believe GH's policy breaks the spirit of the dissident test, because it forces contributors to carry their phones with them, which can be used to track them.

Phones aren't the only way to use 2FA. You can use a Yubikey for example

I don't trust yubikeys either. If there's ever a free-hardware (not just free-software) thing for 2FA, I might like it a bit more. And even then, I don't understand why something like GH should need 2FA. Signing tags with a GPG key should be enough to add trust to the important things.

Just compare how easy it is to steal a physical 2FA device to how difficult it is to steal someone's GPG private key (and the password needed to use it). That's the real MFA. You need to get my GH password and my GPG key and my GPG key's password to be able to trick someone into believing I signed some tag (and if they got my GPG key by stealing my physical device, they also need the (different) password to decrypt the drive :p).

vapier commented 1 year ago

there's a diff between coverage that a CI offers and what standard make check is for. CI systems should be much more strict (e.g. style) than what a local user cares about. people want to make sure the package works before they install it, not that the source is indented with tabs instead of spaces. there are also online-only services for code analysis that are trivial to leverage in a CI that are not that useful locally.

while we should use signed tags (and even signed commits), they are not sufficient protection, and we shouldn't throw out layers of security just because you fully trust pgp. plenty of people don't verify signatures, they just trust what's on the site. how will they even know that the signature is using a trusted key and not some random person's key ? the web of trust has plenty of problems itself in bootstrapping trust and even basic UX. blaming the user for bad project opsec is just trying to cover up for laziness. there's no excuse.

your personal issues when it comes to 2FA solutions are that -- your issues/choices. you're welcome to find alternatives that suit you. nowhere does GH say only phones are supported. in fact their docs are pretty clear in the requirements. you don't even need to use the web interface if you don't want to -- afaik, everything is available via GH's API, and they already offer a CLI interface.

as for your MS conspiracy theories, yawn. if MS decided to change GH today to charge everyone $1000/day in order to access anything, the git tree & history is still distributed and people will have copies to move it somewhere else. but that's not going to happen, so trying to shake that boogie man is a waste of everyone's time, and frankly just makes me want to write off anything else you've written in the same post(s).

as for your suggested alternatives, shadow used them, and migrated away. cgit/gitweb/etc... are just online git browsers. mailing lists are just mailing lists. shadow had a tracker (via Debian's old site), but no one used it because the hosting in general was uncommon. none of the services were integrated. it's great that your personal workflow is fine with bare bones, but it didn't work for shadow.

gitlab might be the only reasonable alternative in terms of feature set & free CI & ease of contributions. it's certainly not as big as GH, but maybe it's big enough for the people who use/contribute to the shadow project.

ikerexxe commented 1 year ago

I'd also prefer to continue using GitHub, or at least some other "centralized" hosting solution like GitLab. From a personal point of view it helps me keep track of the current issue list and the pending patches to review, and it provides handy CI in a single place. I'd like to stress the last point (CI). As a maintainer it's very important, as I don't have to trust a contributor's word that they ran it for the latest changes, only to find out later that they forgot about it and I need to do some nasty thing to the git tree. It has already happened in my experience and I'd like to avoid it.

From a community perspective, I see that the development world is moving towards this type of solution, where all the important pieces of a software package reside in a single place. All the stakeholders use this central place to communicate, raise issues, provide patches, etc. If shadow uses one of the most common places, then it will be easier for other people to contribute.

PS: Regarding 2FA, you can always use solokeys solutions. Either the hardware token or their software solution. If that isn't good enough for you, then I'd recommend checking other providers; I'm pretty sure there must be something.

alejandro-colomar commented 1 year ago

Hi Mike,

On 3/22/23 04:45, Mike Frysinger wrote:

there's a diff between coverage that a CI offers and what standard make check is for. CI systems should be much more strict (e.g. style) than what a local user cares about.

I disagree. Some local users may want to just check that the package works well. Others may want to check that their patch conforms to the style that the project uses (e.g., I do).

In the Linux man-pages, I implemented that as make lint, which I think would be good for most other projects, as a base for the automated CI.

Since you've contributed to the man-pages, I guess you know about its existence, but if not, I suggest that you fetch the latest changes and try it yourself. For anyone reading this, I suggest reading the Linux man-pages' CONTRIBUTING file about this feature: https://git.kernel.org/pub/scm/docs/man-pages/man-pages.git/tree/CONTRIBUTING#n132 You can also run make help to check what linters are available (among other targets), and make help-variables for variables with which you can tweak the build system. Then try make lint, or make lint V=1 if you want to understand what is being run.
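To make that concrete, the invocations described in that CONTRIBUTING file look like this:

$ make help              # list the available targets, including the lint ones
$ make help-variables    # list the variables that tweak the build system
$ make lint              # run all the linters locally
$ make lint V=1          # verbose: print the exact commands being run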

people want to make sure the package works before they install it, not that the source is indented with tabs instead of spaces.

That's true for people that want to install the package. People that want to contribute to the package are (or should be) interested in following the style wanted by the maintainers.

there are also online-only services for code analysis that are trivial to leverage in a CI that are not that useful locally.

That's true. But I'd like us to do as much as we can locally.

while we should use signed tags (and even signed commits), they are not sufficient protection, and we shouldn't throw out layers of security just because you fully trust pgp. plenty of people don't verify signatures, they just trust what's on the site. how will they even know that the signature is using a trusted key and not some random person's key ?

TOFU. Check that all (or a reasonable set of) previous signatures by a given person have always been made with the same key, going back years. There's really not much more you can do than that. How do you know I'm Alejandro Colomar? You never saw my ID card. But my real name is not relevant. What is relevant is that you know I'm the same person that maintains the Linux man-pages (because the emails I post on that list are signed with the same key as this email, and because the tags I sign are also made with the same one).
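In git terms, the check I'm describing is something like this (the tag and key names are placeholders):

$ git log --show-signature --oneline -20    # see which key signed recent commits and tags
$ git verify-tag <some-old-tag>             # spot-check a signature from further back in history
$ gpg --fingerprint <key-id>                # compare against the fingerprint seen in past signed mail

It proves nothing beyond "same key over time", but that's the point.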

the web of trust has plenty of problems itself in bootstrapping trust and even basic UX. blaming the user for bad project opsec is just trying to cover up for laziness. there's no excuse.

Enforcing MFA is blaming the user for others that reuse their passwords, or for servers that allow passwords to be leaked. MFA does not provide security against a targeted attack (because if someone targets you, it's likely that they'll be able to steal your physical devices). MFA is a protection against generic attacks, but another protection against those is using unique strong passwords, and not trusting a site more than it deserves. There's no excuse if anyone tries to assign a site more trust than it deserves and then gets tricked because someone hacked the site.

your personal issues when it comes to 2FA solutions are that -- your issues/choices. you're welcome to find alternatives that suit you. nowhere does GH say only phones are supported. in fact their docs are pretty clear in the requirements.

None really suits me; but as I explained, even if I didn't feel uncomfortable using MFA for other things where it might be more appropriate, I don't think it's good to act as if GH (or any other site) deserved more trust than it really does. The trust for any git repository should be handled by git, and git handles it with PGP signatures, which I believe to be strong enough.

you don't even need to use the web interface if you don't want to -- afaik, everything is available via GH's API, and they already offer a CLI interface.

Is the CLI not affected by this policy? They didn't document that. In fact, they were very blurry as to what the consequences of not using MFA will be. It says your actions will be limited, but not to what extent. If the CLI is not affected by this policy, it may be good enough for me. However, that makes the point even funnier, because then MFA will effectively not be enforced to post on GH, which was the whole point of this policy change, wasn't it?

as for your MS conspiracy theories, yawn. if MS decided to change GH today to charge everyone $1000/day in order to access anything, the git tree & history is still distributed and people will have copies to move it somewhere else.

I'm not saying migrating the code will be difficult. Even Linux had to migrate it when their site got hacked, and there was no issue because of all the clones. The issue is that when/if that happens, all CI will be lost if it's on GH, which will make applying patches a problem for some time. If it's something local (or mostly local), then that issue is avoided.

but that's not going to happen, so trying to shake that boogie man is a waste of everyone's time, and frankly just makes me want to write off anything else you've written in the same post(s).

as for your suggested alternatives, shadow used them, and migrated away. cgit/gitweb/etc... are just online git browsers. mailing lists are just mailing lists. shadow had a tracker (via Debian's old site), but no one used it because the hosting in general was uncommon. none of the services were integrated. it's great that your personal workflow is fine with bare bones, but it didn't work for shadow.

Okay; I'm just pointing out that I'm not happy about this, and would like to know what alternatives are available. Serge also expressed to me that he's not 100% content with GH's interface, so maybe we can work on finding some workflow that suits everyone.

gitlab might be the only reasonable alternative in terms of feature set & free CI & ease of contributions. it's certainly not as big as GH, but maybe it's big enough for the people who use/contribute to the shadow project.

Serge also proposed sr.ht (I never used it, so can't comment). I'm not even saying that shadow should move out of GH fast. I'm just saying that I may be moving out of GH, and maybe it would be good if I could continue to contribute to shadow, thanks to an improved build system (and/or a different hosting site).

If there's some work towards making the CI more local (I'm having a look at that), then maybe I could send patches via email, and if some maintainer wants to open a PR with them, that's not my concern.

Cheers,

Alex

-- http://www.alejandro-colomar.es/ GPG key fingerprint: A9348594CE31283A826FBDD8D57633D441E25BB5

hallyn commented 1 year ago

Hi @alejandro-colomar ,

I appreciate your raising your concerns.

We have plenty of offline backups and forks of the tree so that I'm not very worried about "losing access". So from that point of view, we could just wait and see, and let it be known that at the drop of a hat we will switch away if need be.

What could push me to moving right now would be one of the following two:

  1. contributors are worried about, and object to, AI being trained on their contributions. I think this is valid. When someone raises this, I don't quite know what I'll do. For any code already contributed, however, the cat is out of the bag. Furthermore, even if we move this repo fully to another hoster, people can and will make forks at github. Until we have some legal means of stopping that, it's nothing but a symbolic move.
  2. contributors want to post patches but cannot because they're unwilling to have an account on github. This again I think is valid. If someone - who does not already have a github account - raises this, then I believe the lowest-friction answer will be to request a vger.kernel.org mailing list for shadow.
alejandro-colomar commented 1 year ago

Hi Serge,

On 3/22/23 14:31, Serge Hallyn wrote:

Hi @alejandro-colomar ,

I appreciate your raising your concerns.

We have plenty of offline backups and forks of the tree so that I'm not very worried about "losing access". So from that point of view, we could just wait and see, and let it be known that at the drop of a hat we will switch away if need be.

What could push me to moving right now would be one of the following two:

  1. contributors are worried about, and object to, AI being trained on their contributions. I think this is valid. When someone raises this, I don't quite know what I'll do. For any code already contributed, however, the cat is out of the bag. Furthermore, even if we move this repo fully to another hoster, people can and will make forks at github. Until we have some legal means of stopping that, it's nothing but a symbolic move.

Oh yeah, I forgot about this one. I had it in mind last week though. I'm not happy about AI either.

  2. contributors want to post patches but cannot because they're unwilling to have an account on github.

I am unwilling to have it. In fact, around a year ago, I was going to remove it, and there were only a couple of projects to which I contributed that required me to continue using it, one of them being shadow.

If shadow moves out of github, my account will soon be closed, or kept frozen in case I need it very sporadically.

This again I think is valid. If someone - who does not already have a github account - raises this, then I believe the lowest-friction answer will be to request a vger.kernel.org mailing list for shadow.

I'd love that.

Thanks, Alex

-- http://www.alejandro-colomar.es/ GPG key fingerprint: A9348594CE31283A826FBDD8D57633D441E25BB5

jubalh commented 1 year ago

contributors are worried about, and object to, AI being trained on their contributions

I'm no lawyer or anything. But can't they just pull everything from GitLab (etc.) as well and train it on that? It probably needs a special license to stop that? And we have plenty of cases of GPL violations in software, often with basically no consequences, right?

So even though I understand this argument, I'm not sure it's worth the downsides of moving away from GH.

contributors want to post patches but cannot because they're unwilling to have an account on github

I do believe that this is a small number of people though. And we already have a discussion mailing list. We could use it, or create a new dedicated list, to give the option to send patches there. One of the shadow maintainers will then merge it to his local tree and push it here. Or use a PR to have CI.

It is hard to satisfy everybody. Plenty of people will not be happy about sourcehut, cgit, gitea, GitLab...

If we move to GitLab (similar feature set to GH) some people will complain that they will have to create a new account (you can log in via your GH account on the official instance though). Or that they need to use another site and don't have everything in one place. I believe that having a project on GitHub makes it much more accessible to a vast number of contributors.

If we move to a mailing list based workflow I'm certain that some users don't want to invest the time to learn that workflow and thus not contribute their patches at all as well.

What I have heard from other projects where the same concern was brought up was that they stayed on GH and added the option to send patches to mailing lists, and the maintainers merged those patches. Only downside is slightly more work for the maintainers ;)

alejandro-colomar commented 1 year ago

contributors are worried about, and object to, AI being trained on their contributions

I'm no lawyer or anything. But can't they just pull everything from GitLab (etc.) as well and train it on that?

Well, they can pull the entire www :) However, that would probably be a license violation. If it's on github, it's more likely that they'll do the violation, and I wouldn't consider it crazy if the conditions for using GH changed in the future to require agreeing to let their AI be trained on public repos. If it's outside GH, it's more difficult for them.

It probably needs a special license to stop that? And we have plenty of cases of GPL violations in software, often with basically no consequences, right?

So even though I understand this argument, I'm not sure it's worth the downsides of moving away from GH.

contributors want to post patches but cannot because they're unwilling to have an account on github

I do believe that this is a small number of people though. And we already have a discussion mailing list.

The problem with that one is that it's subscribers-only. I prefer a public one. But I could live with it (or maybe it can be opened up now, to see if spam is still an issue).

We could use it, or create a new dedicated list, to give the option to send patches there. One of the shadow maintainers will then merge it to his local tree and push it here. Or use a PR to have CI.

It is hard to satisfy everybody. Plenty of people will not be happy about sourcehut, cgit, gitea, GitLab...

If we move to GitLab (similar feature set to GH) some people will complain that they will have to create a new account (you can log in via your GH account on the official instance though). Or that they need to use another site and don't have everything in one place. I believe that having a project on GitHub makes it much more accessible to a vast number of contributors.

If we move to a mailing list based workflow I'm certain that some users don't want to invest the time to learn that workflow and thus not contribute their patches at all as well.

What I have heard from other projects where the same concern was brought up was that they stayed on GH and added the option to send patches to mailing lists, and the maintainers merged those patches. Only downside is slightly more work for the maintainers ;)

Maintainers forwarding my patches from email to whatever infrastructure they want to use for their maintenance convenience is something I could agree with. It's up to them to commit to that work :)
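For reference, the flow I have in mind is the standard git one (the list address and branch are placeholders):

# contributor side
$ git format-patch -o outgoing/ origin/master..HEAD
$ git send-email --to=<shadow-list-address> outgoing/*.patch

# maintainer side: apply from the mailbox, then push (or open a PR to get CI)
$ git am patches.mbox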

rbalint commented 1 year ago

Can we conclude that the project stays on GitHub and contributors are free to convert patches received by email to PRs, like before?

vapier commented 1 year ago

On 3/22/23 04:45, Mike Frysinger wrote: there's a diff between coverage that a CI offers and what standard make check is for. CI systems should be much more strict (e.g. style) than what a local user cares about.

I disagree.

it doesn't seem that you do. i said there's a difference between running unittests wrt functionality and linting wrt style. you then provided an example that was exactly that. CI is about doing both. make check is not. i did not say there isn't value in being able to replicate CI behavior locally when reasonable, such as having extended "run the linters" targets.

while we should use signed tags (and even signed commits), they are not sufficient protection, and we shouldn't throw out layers of security just because you fully trust pgp. plenty of people don't verify signatures, they just trust what's on the site. how will they even know that the signature is using a trusted key and not some random person's key ?

TOFU. Check that all (or a reasonable set of) previous signatures by a given person have always been made with the same key, going back years.

no one is doing this. hoping everyone (anyone?) is isn't being realistic. even if they were, you're assuming people have access to the unmodified history, or that such a history exists. you're basically proposing that only people who have been around for years are to be trusted, and everyone new can gtfo.

the web of trust has plenty of problems itself in bootstrapping trust and even basic UX. blaming the user for bad project opsec is just trying to cover up for laziness. there's no excuse.

Enforcing MFA is blaming the user for others that reuse their passwords, or for servers that allow passwords to be leaked.

MFA is being realistic. password reuse is a proven problem, over & over. this includes among IT people. standing on a soapbox and telling people that password reuse is bad isn't going to change anything. the only thing it accomplishes is a more insecure world for everyone.

MFA does not provide security against a targeted attack (because if someone targets you, it's likely that they'll be able to steal your physical devices). MFA is a protection against generic attacks, but another protection against those is using unique strong passwords, and not trusting a site more than it deserves. There's no excuse if anyone tries to assign a site more trust than it deserves and then gets tricked because someone hacked the site.

i think you think digital-only/remote-only attacks aren't useful, and if people wanted to actually attack someone, they'd show up in the real world and hit them with a crowbar. there are plenty of examples in the real world that show the exact opposite. nation states attack dissidents & journalists no matter where they are, and do so while trying to minimize fingerprints that trace back to them. it's pretty easy to disavow electronic attacks (and contract that work out). it's a lot harder to disavow physical attacks.

alejandro-colomar commented 1 year ago

On 3/22/23 04:45, Mike Frysinger wrote: there's a diff between coverage that a CI offers and what standard make check is for. CI systems should be much more strict (e.g. style) than what a local user cares about.

I disagree.

it doesn't seem that you do. i said there's a difference between running unittests wrt functionality and linting wrt style. you then provided an example that was exactly that. CI is about doing both. make check is not. i did not say there isn't value in being able to replicate CI behavior locally when reasonable, such as having extended "run the linters" targets.

Agreed :)

TOFU. Check that all (or a reasonable set of) previous signatures by a given person have always been made with the same key, going back years.

no one is doing this. hoping everyone (anyone?) is isn't being realistic. even if they were, you're assuming people have access to the unmodified history, or that such a history exists.

Yeah, I guess it's not standard practice. I do it, however. For people I've interacted with on mailing lists, I check their emails to the lists as far back as I have them in my mailbox, sampling some from random points in history, and if all looks good, I can trust that "I've been interacting with the same person all that time, and they have a corresponding fingerprint for future interactions", and nothing else (but there's really nothing else that can be proven, without an ID card).

you're basically proposing that only people who have been around for years are to be trusted, and everyone new can gtfo.

Not really; I'm rather new. But I don't expect you to trust me since day 0. I just do my things, and if after some time you consider me worthy of some (of course, limited) trust, you can go back to my interactions with you and with public mailing lists, check if anything looks weird, and if not, you have a fingerprint that you can at least assign to me.

the web of trust has plenty of problems itself in bootstrapping trust and even basic UX. blaming the user for bad project opsec is just trying to cover up for laziness. there's no excuse.

Enforcing MFA is blaming the user for others that reuse their passwords, or for servers that allow passwords to be leaked.

MFA is being realistic. password reuse is a proven problem, over & over. this includes among IT people.

Agree. And I'll go further and recognize that I do that myself. I'm good at memorizing strong passwords, but if I'm required to change an interactive one every couple months, and include symbols in them, you bet I'm not going to remember a 64-char new one at every change, but rather append a &N to a strong password (that is unique to that login). And I also recognize I do use MFA for those ones (and also for a few other cases).

standing on a soapbox and telling people that password reuse is bad isn't going to change anything. the only thing it accomplishes is a more insecure world for everyone.

MFA does not provide security against a targeted attack (because if someone targets you, it's likely that they'll be able to steal your physical devices). MFA is a protection against generic attacks, but another protection against those is using unique strong passwords, and not trusting a site more than it deserves. There's no excuse if anyone tries to assign a site more trust than it deserves and then gets tricked because someone hacked the site.

i think you think digital-only/remote-only attacks aren't useful, and if people wanted to actually attack someone, they'd show up in the real world and hit them with a crowbar. there are plenty of examples in the real world that show the exact opposite. nation states attack dissidents & journalists no matter where they are, and do so while trying to minimize fingerprints that trace back to them. it's pretty easy to disavow electronic attacks (and contract that work out). it's a lot harder to disavow physical attacks.

Those are valid cases, and I never lobbied for disallowing MFA. I'd rather not have it enforced at every other site I visit, so that I can decide for which cases I consider it necessary. Also, maybe I'm missing something now, and in the future I'll recognize that MFA is really great everywhere and start using it, but enforcing it is probably not the best way to make me like it. I'd like to have a choice for now.

jubalh commented 1 year ago

I believe this can be closed.