ara4n opened this issue 5 years ago
Thanks for filing this. This situation has arisen for us a few times when using riot.im. Typically we are either at a conference or in an office meeting (we are mostly remote, so this is rare for us) and we want to set up a chat room to coordinate.
I did once go through the process of validating everyone's keys -- just for me -- and it was a huge hassle, so now we just set up non-encrypted rooms instead.
It would be nice if, when everyone is together, we could just tap phones (i.e. using NFC) and rely on the physical security of the office or conference to verify that only genuine participants are present. It would also be nice to avoid O(n^2) phone tapping: each of us could tap one person's phone and have that establish a "web of trust" (albeit one limited to a single degree of separation).
I vaguely remember that cross-signing is actually considering this use case somehow, but am failing to find/remember how?
One thing that we were considering was that a user could upload a set of keys that they trust to sign other users' keys. So, for example, a company could have a key that it uses to sign everyone's keys, and then everyone could add the company's key as a trusted key. There are a lot of details that need to be figured out, though.
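To make the idea concrete, here is a minimal sketch in Python of that kind of delegation, using PyNaCl for ed25519 (the algorithm Matrix signing keys use). The key names and data shapes are invented for the example; real cross-signing signs canonical JSON key objects, not raw bytes.

```python
# Hypothetical sketch of "trust the company's signing key" delegation.
from nacl.signing import SigningKey, VerifyKey
from nacl.exceptions import BadSignatureError

# The company publishes one signing key; employees add its public half
# to their personal set of trusted keys.
company_key = SigningKey.generate()
trusted_keys = {bytes(company_key.verify_key)}          # what Alice trusts

# During onboarding, the company signs Bob's master key.
bob_master_key = SigningKey.generate().verify_key
bob_signature = company_key.sign(bytes(bob_master_key)).signature

def is_verified(user_key: VerifyKey, signature: bytes, signer_public: bytes) -> bool:
    """Bob counts as verified if his key carries a valid signature
    from any key in Alice's trusted set."""
    if signer_public not in trusted_keys:
        return False
    try:
        VerifyKey(signer_public).verify(bytes(user_key), signature)
        return True
    except BadSignatureError:
        return False

print(is_verified(bob_master_key, bob_signature, bytes(company_key.verify_key)))  # True
```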
I am a counsellor, and the same issue comes up in anonymous private groups (along the lines of AA). The participants don't know each other, so no direct verification is possible. I am the only one who knows the participants, and it would be great if they could simply decide to trust my judgment in verifying them.
A similar use case: I have multiple devices that I log in on. It's a bit onerous to step through the verification process for every pair of devices between me and the other person.
That will be solved by cross-signing, which is currently being worked on.
I have given this issue a lot of thought, and I think it would integrate nicely with communities when they are redone (#1772). I would add the option to set up a chain of trust per community. If communities are rooms, everyone can sign others' keys and publish the signatures in the community room. The trust graph can then be built from them.
The only possible downside I see from it is the social graph leak: if A verifies B, that means that A and B probably know each other, or are physically close. But it would work quite well for that "conference" setting: display a big QR code on the projector, maybe tied to a temporary signing identity that you then mark as trusted for that community.
Rooms that enable it could use a community's trust chain as an additional means of verifying the identities (one would still need to verify the root of trust/community, I guess). Communities could implement more complex key management actions, such as revocation (say if 2/3 of mods vote for revoking a key).
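For illustration, here's a rough sketch of how a client could evaluate such a per-community trust graph. The edge data, key IDs, and the 2/3 revocation rule are just assumptions for the example, not spec behaviour:

```python
# Hypothetical trust graph built from signatures published in a community room.
from collections import deque

# signatures[a] = set of keys that key `a` has signed
signatures = {
    "community-root": {"alice", "bob"},
    "alice": {"carol"},
    "bob": {"mallory"},
}
revocation_votes = {"mallory": 2}   # votes cast by community moderators
moderator_count = 3

def is_revoked(key: str) -> bool:
    # Example rule from the discussion: revoke if >= 2/3 of mods vote for it.
    return revocation_votes.get(key, 0) * 3 >= 2 * moderator_count

def trusted_via_community(root: str, target: str) -> bool:
    """Breadth-first walk of the signature graph from the community root of trust."""
    seen, queue = {root}, deque([root])
    while queue:
        current = queue.popleft()
        if current == target:
            return True
        for nxt in signatures.get(current, ()):
            if nxt not in seen and not is_revoked(nxt):
                seen.add(nxt)
                queue.append(nxt)
    return False

print(trusted_via_community("community-root", "carol"))    # True, via alice
print(trusted_via_community("community-root", "mallory"))  # False, revoked by mod vote
```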
I did mention it a few times previously:
I could try to polish the idea a bit more and make a draft MSC if people are interested, as time permits.
Has there been any more work on this? We've started deploying Matrix with encrypted rooms for a large number of people, and we've been finding the individual user verification process slow.
The idea that the owner of the room is the root of trust sits quite well with me. (Also, perhaps having a way of visualising your own web of trust would be neat.)
In the long run I'd like to see some (easy) method for a user to import "trust that someone else has established". It could be as simple as a file to download and import (see the sketch below), if not a more elaborate way of saying "I trust this instance and I want to trust anyone this instance trusts too". It would need the ability for a person to distrust an individual on the list if they desire, but I feel this is already getting too detailed for the general gist :)
A specific use case for "me": I participate in running a homeserver catering to all Finnish hackerspaces, and I would love to offer a trust relation covering all of our users (who care to take part) to those hackerspace users, and of course to anyone else who cares to trust the data hackfi provides. :)
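For what it's worth, here's a rough sketch of what such an exportable trust file and a client-side import with per-user opt-out might look like. The file format, the `issuer` field, and the domain are invented for the example:

```python
# Illustrative "trust export" a community instance could publish for download.
import json

exported = json.dumps({
    "issuer": "@admin:hackfi.example",
    "trusted_users": {
        "@alice:hackfi.example": "ed25519 fingerprint AAAA",
        "@bob:hackfi.example": "ed25519 fingerprint BBBB",
    },
})

def import_trust(export_json: str, distrusted: set[str]) -> dict[str, str]:
    """Take everything the instance vouches for, minus explicit opt-outs."""
    data = json.loads(export_json)
    return {
        user: fingerprint
        for user, fingerprint in data["trusted_users"].items()
        if user not in distrusted
    }

print(import_trust(exported, distrusted={"@bob:hackfi.example"}))
```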
Another thought on this: it might be useful to do more than a 2-hop trust chain in some very specific situations too.
A 2-hop chain would be Alice -> Admin -> Bob, where the trusted Admin has verified Alice while setting up her phone, meaning she will automagically trust Bob, whom the Admin also verified when setting up his phone.
On the other hand, a 3-hop trust chain could be useful for Alice -> Company A's Admin -> Company B's Admin -> Bob.
Yes - in such cases I'd still expect Alice to have to agree, as in: "A admin trusts B admin transitively. Do you trust B admin transitively on that account?" (Where "trust transitively" means "trust those trusted by".)
A more general approach would be to show trusted people, and provide the ability to set transitive trust for any of those (leading to more trusted people, and the possibility for recursion).
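As a sketch of what hop-limited, opt-in transitive trust could look like (the data structures and names below are hypothetical, chosen to match the Alice -> Admin -> Bob examples above):

```python
# Hop-limited transitive trust where each onward link needs explicit opt-in.
trusts = {                      # direct verifications each user has performed
    "alice": {"admin_a"},
    "admin_a": {"admin_b", "bob_same_company"},
    "admin_b": {"bob_other_company"},
}
transitive_ok = {               # links Alice has agreed to extend trust through
    ("alice", "admin_a"),
    ("alice", "admin_b"),       # "Do you trust B admin transitively?" -> yes
}

def trusted(me: str, target: str, max_hops: int = 3) -> bool:
    frontier, seen = {me}, {me}
    for _ in range(max_hops):
        nxt = set()
        for user in frontier:
            for peer in trusts.get(user, ()):
                if peer == target:
                    return True
                # only keep walking through people I explicitly opted into
                if (me, peer) in transitive_ok and peer not in seen:
                    nxt.add(peer)
                    seen.add(peer)
        frontier = nxt
    return False

print(trusted("alice", "bob_same_company"))   # True, 2 hops: alice -> admin_a -> bob
print(trusted("alice", "bob_other_company"))  # True, 3 hops via both admins
```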
I just opened https://github.com/matrix-org/matrix-spec/issues/1778 which may be a simpler approach to solve a similar problem. The proposal isn't concrete yet but the current idea is that when linking to an account you can also include a fingerprint. This means that the "company directory" problem could possibly be handled by the directory webpage containing key fingerprints in the links. It also handles the "sharing a contact" problem as you can share a contact with a key that you have verified in the URL. The user can then automatically trust that key so that the contact is auto-verified.
It doesn't quite solve this exact problem, because in the company case you may want ongoing delegation of trust, whereas https://github.com/matrix-org/matrix-spec/issues/1778 mostly focuses on one-time trust sharing. But it does handle many of the use cases.
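For illustration only, here is a tiny sketch of how a client might pull a pinned fingerprint out of a shared link. The `key` query parameter and the link shape are made up here, since the proposal doesn't define a concrete format yet:

```python
from urllib.parse import urlparse, parse_qs

def extract_pinned_key(link: str):
    """Return (user_id, fingerprint-or-None) from a matrix.to-style link."""
    fragment = urlparse(link).fragment            # everything after the '#'
    user_id = fragment.lstrip("/").split("?")[0]
    query = parse_qs(fragment.partition("?")[2])
    return user_id, query.get("key", [None])[0]   # "key" is a made-up parameter

link = "https://matrix.to/#/@alice:example.org?key=abcdef0123456789"
print(extract_pinned_key(link))
# ('@alice:example.org', 'abcdef0123456789')
```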
https://news.ycombinator.com/item?id=19167585 points out that there may be legitimate times where you want to avoid O(N^2) verifications between a set of users - e.g. verifying people in a physically secure setting such as an office or conference.
For that matter, if you join a company, it'd be nice if you only had to verify your boss, and then transitively trust the rest of the company.