kjetilk opened this issue 5 years ago
Both user- and resource-associated Authorization Servers (#43) could let users configure some kind of 'trust policies', for example requiring that apps have specific certifications published by specific entities. This would affect the Consent Screen shown when the user gives an app authorizations: it would show whether or not the app meets those policies. IMO this is an even stronger case for having dedicated Authorization Servers, which would take on all the responsibilities related to authorizing apps, revoking authorizations (#24), etc.
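To make that concrete, here is a minimal sketch of how such a trust-policy check could feed the consent screen. The AppCertification and TrustPolicy shapes and the evaluateApp function are purely illustrative assumptions on my part, not part of any Solid specification:

```typescript
// Hypothetical shapes: neither AppCertification nor TrustPolicy is defined by
// any Solid specification; this only illustrates the idea of a user-configured
// policy being evaluated before the consent screen is shown.
interface AppCertification {
  issuer: string; // identifier of the entity that published the certification
  claim: string;  // e.g. "audited", "open-source", "no-third-party-sharing"
}

interface TrustPolicy {
  trustedIssuers: string[]; // certifiers the user (or their community) trusts
  requiredClaims: string[]; // claims an app must carry to pass without warning
}

// Returns the claims the app is missing, so the consent screen can show
// "meets your policy" or list exactly which requirements are not satisfied.
function evaluateApp(certs: AppCertification[], policy: TrustPolicy): string[] {
  const trustedClaims = new Set(
    certs
      .filter((c) => policy.trustedIssuers.includes(c.issuer))
      .map((c) => c.claim),
  );
  return policy.requiredClaims.filter((claim) => !trustedClaims.has(claim));
}
```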
Yes, but that is still just the technical aspect. A random user would not have much to go on to formulate those policies. Dedicated Authorization Servers could serve as a centralization point. I think we need to do much more to involve the social fabric of the ecosystem: the humans, not the machines.
Hi. I agree that there are two forms of "trust" here: one related to security, which relates to certificates etc., and the other about humans, which relates to how and why the app is using their data. For the second one, I am of the strong opinion that users should be able to express their own policies (or reuse them from the community) about how they want to let others use their data, and that such policies should be used to assist them in making decisions and to reduce the ability of apps/companies to manipulate and take advantage of people.
For example, such (machine-readable) policies can help flag certain things or provide additional contextual information when apps request access, so that the user can make a better-informed decision.
In order to enable this, both the app's request and the user/community preferences or guides need to be in machine-readable form so that the agent can interpret and use them. Otherwise there is a strong likelihood of continuing the current malpractices, where users get a notice that only provides a link to a website's T&C that they either don't read or don't fully understand, and end up granting access to do something with their data that they had not anticipated or intended.
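As a rough sketch of what that comparison by the agent might look like, assuming made-up AccessRequest and UserPreferences shapes (a real implementation would use ODRL/DPV terms rather than plain strings):

```typescript
// Hypothetical, simplified shapes for an app's access request and the user's
// machine-readable preferences; none of these names come from a Solid spec.
interface AccessRequest {
  dataCategories: string[]; // e.g. "contacts", "location"
  purposes: string[];       // e.g. "service-provision", "advertising"
}

interface UserPreferences {
  prohibitedPurposes: string[];  // never allow these
  sensitiveCategories: string[]; // always flag these for extra attention
}

// The agent can use the returned flags on the consent screen instead of just
// pointing the user at an unreadable T&C page.
function flagRequest(req: AccessRequest, prefs: UserPreferences): string[] {
  const flags: string[] = [];
  for (const purpose of req.purposes) {
    if (prefs.prohibitedPurposes.includes(purpose)) {
      flags.push(`Requested purpose "${purpose}" is prohibited by your policy`);
    }
  }
  for (const category of req.dataCategories) {
    if (prefs.sensitiveCategories.includes(category)) {
      flags.push(`Request includes sensitive data category "${category}"`);
    }
  }
  return flags;
}
```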
Having to read stupidly large "privacy policies" and "terms and conditions" without actually understanding anything
IMO we should collaborate with community-driven services like https://tosdr.org/ to address this specific problem.
Currently, at use.id we're using the OIDC Dynamic Client Registration metadata values policy_uri (policies) and tos_uri (terms of service) to provide users with links to those documents. Towards the future, we aim to implement something like the ODRL vocabulary, combined with DPV or gConsent (as described in https://github.com/solid/authorization-panel/issues/55). This legal consent information could possibly be embedded within SAI Data Grants (as described here). There already exists a specification combining ODRL with DPV specifically for Solid. In the end, this should enable both the user and the application to express legal conditions that are also machine-readable, and which can therefore be displayed in a structured manner and even programmatically compared.
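For reference, policy_uri and tos_uri are standard client metadata fields from OAuth 2.0 Dynamic Client Registration (RFC 7591) and OIDC Dynamic Client Registration. The sketch below uses placeholder URLs (not real use.id values), and the commented-out machine-readable policy link is a hypothetical extension, not an existing field:

```typescript
// Sketch of OIDC Dynamic Client Registration metadata using the standard
// policy_uri and tos_uri fields; all URLs and names are placeholders.
const clientMetadata = {
  client_name: "Example Solid App",
  redirect_uris: ["https://app.example/callback"],
  policy_uri: "https://app.example/privacy-policy", // how the app uses the user's data
  tos_uri: "https://app.example/terms-of-service",  // terms the user agrees to
  // A possible future extension, as discussed above, could additionally link a
  // machine-readable ODRL/DPV policy, e.g.:
  // "odrl_policy_uri": "https://app.example/policy.ttl", // hypothetical field
};

console.log(JSON.stringify(clientMetadata, null, 2));
```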
Hi. I have written up an article titled "Making Sense of Solid for Data Governance and GDPR" (https://osf.io/m29hn/) that analyses how Solid in its current state relates to GDPR's requirements, what some of the possible governance models are (for Pods and Apps), and which known problematic issues also apply to Solid. The aim is to emphasise the necessity and importance of answering (through developments) the question this issue has raised. The article also explores some specific ideas for improving things (Section 8).
We're discussing a lot about the technical management of access control, but there's much more to it in the social space. I'm fortunate enough to have kids who trust me enough to ask me whether they can install a certain app on their mobiles and tablets. Although I'm not revealing that to them, the usual answer is "I have no idea". I have very little to go on in terms of deciding whether an app is trustworthy. It is just a bunch of heuristics, and funnily, the kids develop their own heuristics too.
I think this highlights a much bigger problem than the technical ones: we have to enable people to make much more informed decisions about access control on the open Web for Solid to be useful beyond tight social groups. If not, Solid is likely to end up accommodating only a small number of large players, or to become a place where social engineering to extract private information is rampant.