richvdh opened 4 years ago
For Synapse, this is how I see it. There are two scenarios:

- With the `openid` scope, Synapse can query the userinfo endpoint with the token. It can't check for scopes there, but at least it can check for token validity.

Delegating auth to a third-party authorization server would make it much easier to implement deployment-specific policies and auth methods (2FA, FIDO, passwordless logins) without having to implement them in Synapse. Many of those identity providers have years of development and security hardening behind them that would benefit a lot of deployments.
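The userinfo-based validity check described above can be sketched as follows. This is an illustrative sketch only, assuming a hypothetical issuer URL; it is not Synapse's actual implementation:

```python
# Hypothetical sketch: checking an access token's validity by calling
# the authorization server's OIDC userinfo endpoint with it.
# A 2xx response means the token is currently usable; note that the
# userinfo response does NOT reveal which scopes the token carries.
import json
import urllib.error
import urllib.request


def userinfo_request(userinfo_url: str, access_token: str) -> urllib.request.Request:
    """Build the GET request carrying the token as a Bearer credential."""
    return urllib.request.Request(
        userinfo_url,
        headers={"Authorization": f"Bearer {access_token}"},
    )


def check_token(userinfo_url: str, access_token: str):
    """Return the userinfo claims if the token is valid, else None."""
    try:
        with urllib.request.urlopen(userinfo_request(userinfo_url, access_token)) as resp:
            return json.load(resp)
    except urllib.error.HTTPError:
        # 401/403 => token expired, revoked, or otherwise unusable
        return None
```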
Another thing to consider is keeping much of the existing login logic while slowly implementing most of OAuth2. Since an access token is bound to a specific client, existing clients using the current login mechanisms could be identified as a special "legacy" client for which tokens are issued without expiration (current access tokens have no expiration and no mechanism to renew them). In that case, it would even be possible for Synapse to obtain tokens from another AS using the password grant (if the AS supports it).
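The password grant mentioned above (RFC 6749 §4.3) works by POSTing the user's credentials to the AS token endpoint. A minimal sketch of the request body, with an illustrative client id:

```python
# Sketch of the OAuth2 resource owner password grant request body
# (RFC 6749 section 4.3). The homeserver would POST this form-encoded
# body to the AS token endpoint to exchange the user's password for a
# token, keeping the legacy login flow working. "legacy-client" is a
# made-up client_id for illustration.
from urllib.parse import urlencode


def password_grant_body(client_id: str, username: str, password: str) -> bytes:
    """Form-encoded body POSTed to the AS token endpoint."""
    return urlencode({
        "grant_type": "password",
        "client_id": client_id,
        "username": username,
        "password": password,
    }).encode("utf-8")
```

Worth noting that the password grant is deprecated in current OAuth2 security guidance, which is one reason it only makes sense as a transitional measure for legacy clients.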
Last but not least, it would be interesting to see what this means for integrations. I've seen in Synapse's code that it implements a basic OIDC userinfo endpoint because of integrations; there is probably something to investigate there.
OAuth2 could be a UIA stage, but UIA allows greater flexibility.
At work we use it for additional verification steps, to only allow login to "good" servers from our app.
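For context, a UIA challenge is a 401 response listing flows made of stages, so an OAuth2 stage could in principle slot in alongside others. A sketch of what such a response body might look like; the `m.login.oauth2` stage type is invented for illustration and is not in the spec:

```python
# Sketch of a User-Interactive Auth 401 response body with a
# hypothetical OAuth2 stage. "m.login.oauth2" is made up for
# illustration; the overall shape (session/flows/params) and the
# m.login.password stage follow the Matrix Client-Server spec.
uia_challenge = {
    "session": "xxxxxx",  # opaque session id chosen by the server
    "flows": [
        {"stages": ["m.login.password"]},
        # A flow could chain an OAuth2 step with other stages:
        {"stages": ["m.login.oauth2", "m.login.password"]},
    ],
    "params": {},
}
```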
> and as ever, particularly when considering security-sensitive subjects, it's better to use standardised protocols (even when they are as sprawling as OAuth2) than to roll your own.
That's not quite right - the good practice is to use battle-tested protocols. Unfortunately, that set does not fully overlap with "standardized protocols", and in fact quite a few standardized protocols designed by non-cryptographers are infamous for bad cryptography and insecure API design (see e.g. the history of weird and broken crypto in WEP/WPA).
Notably, JWT is terrible from a security perspective, and OAuth2 has likewise been frequently criticized by people in the security community for its high degree of complexity, which results in a risky attack surface.
I honestly don't think it's a good idea to use JWT or OAuth2 for use cases other than those where they are necessary for interoperability reasons.
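To make the JWT criticism concrete, one classic footgun is a verifier that trusts the token's own `alg` header: it can be handed an unsigned `"alg": "none"` token and accept it. A stdlib-only illustration of the broken pattern (not code from any real library):

```python
# Demonstration of the JWT "alg: none" pitfall. forge_none_token()
# builds an unsigned token; naive_decode() models a BROKEN verifier
# that honours the token's own alg header instead of pinning the
# expected algorithm, so the forged token is accepted unchecked.
import base64
import json


def _b64url_encode(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()


def _b64url_decode(s: str) -> bytes:
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))


def forge_none_token(claims: dict) -> str:
    """Build an unsigned JWT: header.payload. with an empty signature."""
    header = _b64url_encode(json.dumps({"alg": "none", "typ": "JWT"}).encode())
    payload = _b64url_encode(json.dumps(claims).encode())
    return f"{header}.{payload}."


def naive_decode(token: str) -> dict:
    """BROKEN verifier: lets the attacker-controlled header pick the alg."""
    header_b64, payload_b64, _sig = token.split(".")
    header = json.loads(_b64url_decode(header_b64))
    if header["alg"] == "none":
        # Accepts the claims with no signature check at all.
        return json.loads(_b64url_decode(payload_b64))
    raise NotImplementedError("signature verification would go here")
```

A correct verifier pins the expected algorithm and key up front rather than reading either from the token, which is exactly the kind of sharp edge the comment above is pointing at.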
Quick update: this project is progressing, as we get time.
I've been working on this area of Matrix for a little while now and on revisiting this issue I wanted to share what I believe are the main reasons for considering this change.
One thing to note is that I don't see this change as being about replacing UIA with OIDC, but instead about allowing homeservers to delegate auth duties to another entity.
By adopting such an architecture I see that there can be benefits to the ecosystem on multiple fronts:

- It allows us to benefit from security protocols that have been battle-tested by many more people than just the Matrix ecosystem.
- It reduces the scope of the Client-Server spec, freeing up spec time to focus on its USPs (e.g. decentralised messaging).
- From a homeserver implementor's point of view, it reduces the codebase.
- There are also benefits from a Matrix client implementor's point of view.
Over the years, we've evolved our own authentication/authorization system in the shape of the login api, user-interactive auth, access_tokens, and soft logout.
The current implementation is largely functional, but it does duplicate much of what OAuth2 does, and as ever, particularly when considering security-sensitive subjects, it's better to use standardised protocols (even when they are as sprawling as OAuth2) than to roll your own.
In addition, there are a bunch of things which we could improve in the current system, but which OAuth2 already has answers for.
I've started a document at https://docs.google.com/document/d/1e-ZqghugaHaRukb88PRmPFdm24-y-Ea5ka89IrAEZ7s to collect notes on what might be better with OAuth2, and how it might work in practice.