matrix-org / matrix-spec

The Matrix protocol specification
Apache License 2.0

Controlled accounts #797

Open twouters opened 3 years ago

twouters commented 3 years ago

Hi,

I've been looking for a way for my kid to communicate with their classmates/friends in a responsible way (both privacy and security) but my pursuit ended with the sad conclusion that there currently is no such thing.

I'm not going to introduce my kids to pests like FB and the rest. Or rather, I would at least like to keep them away from those platforms for as long as I can, since I realize that the social pressure to use them will only grow as kids age.

So before going into much more detail: this idea is targeting a set of features to support communication for children younger than 13 years old. This is a very sensitive subject so if the age target is out of the question due to law restrictions or whatnot, feel free to say so and there's no need to continue reading my ramblings.

So the idea is to create a walled garden where younger children can communicate freely, without their parents having to actively set up a communication line on the currently existing platforms (which are all targeted at adults or >13y) each time the kid wants to talk (chat/video call/doodle/whatever) with one or more friends, classmates, family members, ...

Child accounts should be restricted in many ways:

There are probably other things that I can't come up with right now, so I might add some later on.

This feature set could also be used to provide safe spaces for elementary schools (classrooms, playground rooms, club rooms), sports clubs and other organizations revolving around children, where you want children to communicate with their peers without fearing harassment by external parties.

SimonBrandner commented 3 years ago

Note: This comment takes a bit of a worst-case scenario view on this but I feel like that should be a part of this (but it shouldn't be the main part).

In general, I am not in favour of child accounts and similar (at least partially). That is mainly because I feel this can be abused. If I were an LGBT kid in a homophobic family, this could be very harmful, and the same applies in other cases. But I also understand the need for parents to have some kind of control (probably not the right word here) over what their children are doing.

I realize that there is much more nuance to this, but I feel the consequences should be considered - these controls should be entirely configurable, either by the client or by the user (in this case, the parent).

all parent accounts should be able to access end-to-end encrypted communication (this might be a bit tricky because if all parent accounts are automatically joined in every room, the children will get the feeling of being continuously watched which is not what I want to achieve)

This is more of a client-side issue but I feel like I'd rather be aware of the fact that my parent can read the messages than not know.

all child account actions are constrained to non-public spaces. (not sure if that's a thing, I should read up on the current spaces spec)

Rooms and spaces can sometimes serve other purposes than messaging (e.g. announcements) - perhaps it would make more sense not to allow participating (sending messages) in rooms/spaces?

ShadowJonathan commented 3 years ago

I think that, bottom line, the trust between parent and child should itself be the focus, rather than a surveillance-like approach, which (in my opinion) will only do more harm to the kid's self-agency and self-control.

Parenting of this sort should come from a trust feedback loop between the kid and the parent, where the kid trusts the parent's opinion, and the parent trusts the kid to ask them. The approach of monitoring their social media usage closely, imo, takes it more to the "but think of the children!" side of parenting, which (imo) is very toxic and probably detrimental to a kid's development.

If a parent truly wants to control their kid this far, they'd probably just install Element on their own phone, and have their kid log themselves in.


However, I think the idea of a "controlled account" is an interesting one in general: such accounts could be controlled by corporate IT - for management or legal compliance - and managed centrally. This could replicate a Slack-like environment and bind these accounts to the corporate environment (unless access is explicitly allowed by "query" (by the user) and "confirmation" (by IT)).

If the above could be implemented somehow, together with nomadic/decentralised accounts, one could hold onto a Matrix "identity" in a multitude of ways: getting one at a corp, but "retaining" it even after leaving, or when joining another corp. This needs more investigation and research, but it would be a huge boon for corporate/business use of Matrix.

A "parental guidance" mode could then simply derive from such "controlled accounts"; through the aforementioned "query and confirmation" method, it would allow kids some agency while letting parents set the boundaries with minimal privacy intrusion.

twouters commented 3 years ago

But I also understand the need for parents to have some kind of control (probably not the right word here) over what their children are doing.

The goal is not to snoop on conversations but to protect them from being contacted by, or interacting with, possibly abusive persons - but it's a slippery slope for sure.

all parent accounts should be able to access end-to-end encrypted communication (this might be a bit tricky because if all parent accounts are automatically joined in every room, the children will get the feeling of being continuously watched which is not what I want to achieve)

This is more of a client-side issue but I feel like I'd rather be aware of the fact that my parent can read the messages than not know.

Of course they should be aware, but I'd rather have a notice saying which parent accounts have access to whichever conversations they deem necessary than have the accounts "physically" present in all rooms, breathing down their necks.
Worst case, parent accounts should be allowed to join conversations freely, and the room events should preferably be visible to the child accounts (join/part/read markers).

twouters commented 3 years ago

Parenting of this sort should come from a trust feedback loop between the kid and the parent, where the kid trusts the parent's opinion, and the parent trusts the kid to ask them. This approach (imo) takes it more to the "but think of the children!" side of parenting, which (imo) is very toxic and probably detrimental to a kid's development.

I agree for the most part. The primary reason for including the snoop permissions is to prevent abuse, but if there's another way to accomplish this (by setting proper boundaries somehow) that would be fine too.
In the example of schools the snooping part could be used as a tool to limit cyber bullying, so it could become an optional feature instead of a requirement.

uhoreg commented 3 years ago
  • all parent accounts ~~should~~ may be able to access end-to-end encrypted communication (this might be a bit tricky because if all parent accounts are automatically joined in every room, the children will get the feeling of being continuously watched which is not what I want to achieve)

This could potentially be solved by allowing parents to access the key backup somehow.

I think that bottom line, the trust between the parent and child should be focused on more, rather than a surveillance-like approach which (in my opinion) will only do more harm to the kid's self-agency and self-control like that.

Perhaps one possibility would be that the parent account needs to join the room in order to read messages -- so they can join a room to read old messages, but they don't have to be in the room all the time. Thus kids will know when a parent checks in on their conversation (because they'll be able to see the parent join the room), which will probably discourage parents from doing so too often, but still keeps the feature available for emergency situations.

SimonBrandner commented 3 years ago

Perhaps one possibility would be that the parent account needs to join the room in order to read messages -- so they can join a room to read old messages, but they don't have to be in the room all the time. Thus kids will know when a parent checks in on their conversation (because they'll be able to see the parent join the room), which will probably discourage parents from doing so too often, but still keeps the feature available for emergency situations.

Sounds sane :+1: But it doesn't really make my fears go away:

If I were an LGBT kid in a homophobic family this can be very unhelpful, the same can be applied in other cases

Another pessimistic comment - sorry

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 3 years ago

My question is: is this actually something that belongs in the protocol specification, or would it be better as an implementation feature? There are plenty of features in Synapse that aren't in the protocol, and I'm not convinced that a feature like this requires changes to the protocol, which is intended to be a messaging protocol at its core. It doesn't really care who creates or has what kind of control over an account, as long as that entity doesn't have to trust other entities besides their client and homeserver, and it's specifically left generalized so that developers can add features with less-than-universal appeal. This proposal amounts to "Parental Controls", which don't have super wide appeal; to people other than parents and children it has little relevance, and even amongst parents there's debate as to whether "walled gardens" are even a good idea. This is hardly universal appeal.

Of course, whether this feature (Parental Controls) is even compatible with the Matrix.org Foundation's principles and thus eligible to be implemented in Synapse is another question entirely. But before we discuss that, how about first we get this proposal in the right forum?

ShadowJonathan commented 3 years ago

(This is why i pivoted to "controlled accounts" in an enterprise environment, which could be the right application for an area where it's sure to be requested and used, and then to implement a small layer on top of it which pertains to parental guidance)

twouters commented 3 years ago

Well to be honest, I didn't really put much thought into where to propose/request this feature at all.
All I know is that it has its purposes, and that I'm personally not capable of forking synapse to implement it on my own and keep that fork in sync with upstream.

Clients must implement support for it as well and if a feature is specific to a certain server implementation their developers will be less encouraged to do so.

ShadowJonathan commented 3 years ago

Yeah, this is a huge undertaking (to do just parental guidance), and I don't suggest spending your time working to make it a reality. This issue helps a lot with putting it into focus; I hope your suggestions (and others') are considered when someone thinks of an MSC that could address this.

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 3 years ago

@ShadowJonathan Ah, I skimmed too fast. But even so, most desired constraints would vary drastically between varying scenarios (Parental Controls vs. Corporate) and even between different agents in the same scenario. This variety would make most forms of account control belong in an implementation, rather than in the Matrix specification. Only universally desired account control properties would belong in the specification, i.e. something at least as universally desired as "I want to ban someone from a room".

An example of such a basic feature might be to require homeservers to maintain "groups" or other basic account control identifiers. This might be required to include a "root" sysadmin group, as well as a normal end user group. We then might require servers to maintain a Directed Acyclic Multigraph representing a partial ordering between groups; each transition would represent a single API call that a user in the higher group can use on a user in the lower group, etc.

Of course, this is just an example, and getting it to work would require many additional details. Rather, this is the broad, generic kind of change that might be worth putting into the Matrix specification. The exact group hierarchy, in this case, and how certain users/groups can control others, is likely relegated to the server implementation.
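To make the group-multigraph idea above concrete, here is a toy sketch. Everything in it (the group names, the API call names, the rule that a call must be carried on every edge along a path) is hypothetical and is not a real Synapse or Matrix-spec structure:

```python
# Hypothetical sketch of the "group multigraph" idea: groups are nodes, and
# each directed edge carries the API calls that members of the higher group
# may invoke on members of the lower group. A pair of groups can be linked
# by several calls, hence "multigraph". Assumed acyclic (a DAG).
edges = {
    ("root", "admins"): {"deactivate_account", "reset_password"},
    ("admins", "users"): {"reset_password"},
}

def may_call(graph, actor_group, target_group, api_call):
    """True if actor_group can reach target_group along edges that all
    carry api_call - i.e. the call is permitted under the partial order."""
    if api_call in graph.get((actor_group, target_group), set()):
        return True
    # Follow intermediate groups: actor -> mid -> ... -> target.
    return any(
        api_call in calls and may_call(graph, mid, target_group, api_call)
        for (hi, mid), calls in graph.items()
        if hi == actor_group and mid != target_group
    )

print(may_call(edges, "root", "users", "reset_password"))        # True (via admins)
print(may_call(edges, "admins", "users", "deactivate_account"))  # False
```

Requiring the call on every hop is just one possible design choice; one could equally let any path confer all of the lower group's controls.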

@twouters You are right that the client has to support the feature, but Matrix was never intended to prohibit features not in the specification. In fact, it was designed to be extensible, i.e. to allow people to add additional APIs that wouldn't necessarily be supported by every client; it would then be the responsibility of the new API maintainer to:

  1. Make sure that the features extend the Matrix protocol in a way that's intended (e.g. by adding new types of events) to avoid breaking basic Matrix features.
  2. Notify users that their special features don't work with vanilla clients. Ideally, sufficiently popular features get integrated into the specification once they've become a de facto standard.

Until then, this feature still may have a place in Synapse. Note Synapse already adds additional API features for admins, e.g. shadow banning, so having an account control API may be feasible, even if not in the specification.

beercanchicken commented 3 years ago

Great idea. Adding an account type that can be supervised is a key marketable feature for enterprise. Where real-world liability is concerned, certain account types must have the ability to be surveilled if necessary. Generally, these accounts would be used as part of a voluntary transaction, such as using an account that serves as an authorized communication outlet for an entity.

Children are under the responsibility of the parent or guardian, and certain legal implications of whatever the child gets into fall on the parent. The ability for a parent or guardian to check-in on a conversation allows them to handle an unfavorable situation before it gets out of hand.

These two features are similar enough that their common controls could be adopted by the server:

This expands native support for app makers to develop clients in accordance with their target demographics. One potential use for a supervisory account would be a live chat or support module for a website (this also potentially requires other server mods not elaborated on here). Notifications of supervisory check-in events could be handled by the client.

I see potential appeal to the supervisory levels of a corporate structure for supporting limited, supervisable accounts. This could translate into enhancing the offerings of EMS (therefore, enhancing potential revenue.)

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 3 years ago

@YosefSinger Seeing how the spec is a mostly technical document, including or excluding something from the spec has little effect on whether some random employee understands the system; Joe Shmoe who works at EvilCorp will almost never look up the specs of his corporation's chat client. To the contrary, the server is capable of doing all sorts of things outside of the spec. Consequently, you cannot even trust someone else's homeserver completely, which is why the Matrix.org Foundation administrators always recommend "run your own server". The focus of Matrix is achieving security and privacy via interoperability, not by eliminating trust between client and homeserver. The spec currently does require trust in the homeserver.

Until the spec removes all need to trust the homeserver, nothing you add to the spec can keep the homeserver from abusing that trust. In the case of e.g. a corporate environment, where users likely have no choice as to their homeserver, corporations are then free to abuse their users, and nothing in the spec can change that. Even if there's an alternative in the spec to the expected E2E encryption, etc., you can't stop abuse of the alternative any more than you could the default!

Additionally, on the grounds that the Matrix Manifesto states,

The ability to converse securely and privately is a basic human right.

I vehemently oppose having anything other than E2E encryption be the standard. If you wanna have an implementation detail in Synapse to make it easier for administrators to create controlled accounts in the same manner done elsewhere in industry (e.g. admins create the username, password, have a copy of the keys as if they were the client, etc.), then that's a convenience that belongs in Synapse.

But no user, creating a new account of their own on a homeserver, should have any doubt that their communications are secure. Adding alternatives to the spec permits Matrix-compatible clients to use dark patterns, such as making the user think they're creating a personal, secure account, while actually creating another kind of (controlled) account. This is antithetical to the goal of secure and private communications. The notion that someone can just use another client here ignores the fact that, if Matrix is to be as ubiquitous as it aspires to be, the average end user won't be savvy enough to know the difference and will just use whichever client is popular. Adding this to the spec risks compounding this problem.

twouters commented 3 years ago

I believe there's currently too much focus on the snooping feature, which could mostly be implemented client-side.
i.e., child/employee/whatever accounts could purposely share a secret with their parent account, or get an obvious notice in the client interface that their conversations can be accessed by their parent account(s). I agree that this is not a requirement for the core spec.

In my opinion this is still an important feature for <13 year olds since you, as a parent, guardian or teacher still want to ensure the safety of those users in a positive way. (Think abusive/predatory adults or oppressive children)

So I'll update my initial comment to leave that out as a nice-to-have for a corporate/family targeted client feature. (Though I do have my concerns that there should somehow be a contract to mandate acknowledgement of this from the users' perspective - which I agree will be hard or even impossible to uphold.)

The discussion could focus more on which features could become part of the core spec instead, like being able to control which rooms/servers can be reached using a specific account.

There are a few possible options in my mind, depending on the situation:

ShadowJonathan commented 3 years ago

The SCT (Spec Core Team) will prioritise these efforts internally; drafting and pushing an MSC that adds basic or near-complete support for this will fall on their radar, though they'll have more hands to address it after Spaces. But that doesn't mean you need to wait before drafting: the earlier we see the full draft, the earlier the community (and some SCT members taking a glance at it) can comment on it! :D

kevincox commented 3 years ago

I think part of the problem is that the solution here will likely want to integrate with spaces. So once spaces are finalized, we will have a much better idea of what is needed here. Even if this doesn't interact with spaces for v1, it would be useful to consider how it may interact in the future.

steef435 commented 2 years ago

Since I haven't seen this point so far: if user X is registered on homeserver A, then homeserver A knows:

This is all available through Synapse's admin API. (You could even write scripts to get notified when people create/join rooms etc.)
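As a very rough sketch of such a script: the `joined_rooms` endpoint below is part of Synapse's admin API, but the exact path and response shape should be verified against your Synapse version, and `BASE_URL`, the token, and the user ID are placeholders:

```python
import json
import urllib.request

# Sketch of a "tell me when this user joins a new room" poller built on
# Synapse's admin API. Endpoint path and response shape are as documented
# for Synapse's admin API, but verify them against your server version.
BASE_URL = "https://matrix.example.org"   # placeholder homeserver
TOKEN = "syt_placeholder_admin_token"     # placeholder admin access token

def fetch_joined_rooms(user_id):
    """GET /_synapse/admin/v1/users/<user_id>/joined_rooms -> set of room IDs."""
    req = urllib.request.Request(
        f"{BASE_URL}/_synapse/admin/v1/users/{user_id}/joined_rooms",
        headers={"Authorization": f"Bearer {TOKEN}"},
    )
    with urllib.request.urlopen(req) as resp:
        return set(json.load(resp)["joined_rooms"])

def new_joins(previous, current):
    """Rooms present in the current snapshot but not the previous one."""
    return sorted(current - previous)

# Polling loop (sketch, not run here):
# prev = fetch_joined_rooms("@kid:example.org")
# ...later...
# for room_id in new_joins(prev, fetch_joined_rooms("@kid:example.org")):
#     print("new room joined:", room_id)
```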

So to some extent, a homeserver admin has a bit of a handle on what their users are doing. Especially if the homeserver does not federate. Maybe just making that more accessible could be enough for a "walled garden for children" mentioned in the OP?

The only things missing are censoring the public room directory (this is not a problem in a walled garden where there are no rooms outside of the homeserver, but it would still be useful for other purposes), and snooping in on encrypted conversations (but I imagine a parent can ask their child to show them around if necessary.)

blaggacao commented 1 year ago

Today's parents will have a huge challenge to raise their children in psychological safety.

The internet is our modern jungle. As a care-taker you often times don't even know the dangers. We didn't have enough time to co-evolve, to learn — at a global pace (Ever heard of Omegle? I didn't.)

Yes, we wanna build a village, a "walled garden", where our offspring can enjoy the freedom to play in relative safety with their friends.

Where no wulfs or snakes do enter, for the wulf would be stronger and the snake would be more deceptive than that age can reasonably handle.

Learning is a gradual process.

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 1 year ago

Today's parents will have a huge challenge to raise their children in psychological safety.

The internet is our modern jungle. As a care-taker you often times don't even know the dangers. We didn't have enough time to co-evolve, to learn — at a global pace (Ever heard of Omegle? I didn't.)

Yes, we wanna build a village, a "walled garden", where our offspring can enjoy the freedom to play in relative safety with their friends.

Where no wulfs or snakes do enter, for the wulf would be stronger and the snake would be more deceptive than that age can reasonably handle.

Learning is a gradual process.

Most jurisdictions expect children to be at least 13 before using the internet unsupervised. This is encoded into various laws, especially in the West. If someone cannot "handle wulfs and snakes", nor even identify or avoid them, after being parented for 13 years, that suggests a deficiency in the parenting method. It is not the role of the Matrix specification to address deficiencies in parenting, but to provide a secure, private, and extensible communications protocol built on HTTP. If it is useful for helping parents manage their children, that is coincidental, and a luxury.

If there is a social need for a "logical" or "soft" walled garden for younger children that spans homeservers, this can be handled by adding account control and public directory whitelisting in the reference implementation (Synapse), as steef435 noted. Since some voices in this thread clearly want someone else to build their particular brand of walled garden for them (never mind the maintenance costs of merely adding configurability), they should create an issue on the Synapse repository, not the spec or proposal repos.

Even if the spec were updated to require support for controlled accounts, most homeserver projects would simply ignore it, whether for complexity or ethical reasons. This would be unacceptable, as a spec that isn't respected cannot be a standard. But even if we weakened the requirement by allowing a homeserver to respond "controlled accounts are unsupported", that is a design anti-pattern: Matrix is specifically designed to be an extensible protocol, so optional features should live in a new namespace.

But even if it were desirable to put this particular account-control functionality into the core spec, there is no basis for including this model of account control but not others. Ultimately, the "wall" in "walled garden" refers to a security model, and each service and user has their own unique idea of what a "wall" is. If we included even 10% of the configuration rules required to account for most "walls" people want today, the spec would become so bloated that account control would eclipse the rest of it. Bloated specs are just as bad as, if not worse than, bloated software.

In any case, controlled accounts do not belong in the core spec.

blaggacao commented 1 year ago

In any case, controlled accounts do not belong in the core spec.

Are you challenging the requirements?

I read a concoction of implementation and maintainability concerns. I don't agree with citing these as "retroactive" arguments to challenge the requirement.

I also read an appeal to laws to obviate the need, which ignores every practical aspect.

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 1 year ago

blaggacao said: Are you challenging the requirements?

The supposed need to have walled gardens for children or corporate controlled accounts in the spec is not a "requirement" just because you say so. It does not become a "requirement" for the Matrix spec until the Matrix spec team says it is. There are no requirements to challenge, so the rest of your post is largely moot.

Anyways, my arguments about "core spec" do not preclude the feature. They only concern what systems it should be implemented in, imploring the reader to take the discussion to the matching forum. In this case, I stated technical arguments for why it should not be in the protocol, but in Synapse, and suggested creating an issue in the Synapse repository. As the arguments are entirely technical, I expect a technical counterargument for why e.g. this feature must be implemented in all Matrix systems, per the protocol, instead of just those within a walled garden or private network. Any further discussion in your role as a stakeholder (perhaps as a parent of young children?) is out of scope and out of touch with the technical reality of Matrix.

If you feel you can actually address my technical points, please do, in a way that does not violate Matrix's design principle of homeservers not being required to trust other homeservers.

blaggacao commented 1 year ago

@izN8nu6RyeneG5XnBoBgyRMVGH6H43WF Hm, maybe my answer did conceal, rather than reveal, my desire for material progress towards a solution for that need?

In between the lines of your replies I see some hints to pick up; however, they are largely buried in prose that reads like fundamental opposition to the need (a.k.a. the requirement). So at this point I apologize and don't really know what to do next.

Every parent is a potential stakeholder, and so am I. Matrix is as yet the best answer to asynchronous communication for the evolving homo digitalis.

Beyond technicalities, I hope for a stance that acknowledges that the needs of upbringing are in scope for such a fundamentally human-centric approach to this great piece of technology.

I say this from the vantage point that I would never wish (for myself or my family & friends) to use a different protocol for async (and even sync) comms, and with a strong belief in its fundamental values and its future overarching success.

turt2live commented 1 year ago

For clarity, this issue still being open at least signifies that we (the Matrix team) believe that there's enough merit to the idea being considered for the specification in some capacity - it would otherwise be closed as a great thing for an implementation to do. However, where or how it gets incorporated into the spec is yet to be determined - an MSC (likely following further conversation here on this issue) would help answer that particular question.

All told, it's a bit early to be shutting ideas down so strongly. There are legal requirements some server operators will be under, and there are reasonable expectations that every server behave roughly the same for such accounts - this issue is meant to serve as a discussion place for this sort of feature, not as a place to deny others from contributing.

twouters commented 1 year ago

Initially this issue was created with the word "child" in the title, but it has been removed to reduce the controversy around it. (Legality and other things I don't know anything about.)

Things might become easier if we focus on the corporate/enterprise application; if people want to use these features to accommodate their family's needs, that's up to them.

Looking at the current state of spaces, the implementation might not be too far from "just" preventing space members from leaving a space or initiating contact with outsiders (users or spaces from the same or other homeservers) and admin accounts within that space to approve contact with specific outsiders.
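A very rough sketch of what such a space-bound restriction could look like server-side. None of this exists in the Matrix spec or Synapse; the policy tables, user IDs, and function names are all illustrative:

```python
# Hypothetical server-side policy for a space-bound "controlled account":
# a restricted account may only initiate contact inside its approved
# spaces, plus individual exceptions approved by a space admin.
RESTRICTED_SPACES = {"@kid:example.org": {"!school-space:example.org"}}
APPROVED_CONTACTS = {"@kid:example.org": {"@grandma:example.org"}}

def may_contact(user_id, target_user_id, shared_spaces):
    """Allow contact if the target shares an approved space with the
    restricted user, or was individually approved as an exception."""
    allowed_spaces = RESTRICTED_SPACES.get(user_id)
    if allowed_spaces is None:
        return True  # not a restricted account
    if shared_spaces & allowed_spaces:
        return True  # target is inside the walled garden
    return target_user_id in APPROVED_CONTACTS.get(user_id, set())

print(may_contact("@kid:example.org", "@classmate:example.org",
                  {"!school-space:example.org"}))                       # True
print(may_contact("@kid:example.org", "@stranger:example.org", set()))  # False
print(may_contact("@kid:example.org", "@grandma:example.org", set()))   # True
```

Enforcing the same rule across federation is the hard part; a homeserver can only refuse on behalf of its own users, which is why the comment above limits the garden to known counterparties.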

blaggacao commented 1 year ago

Preventing both breaking in and breaking out does indeed seem like pretty good perimeter security that encapsulates an environment of known (and hence accountable) counterparties.

That would probably constitute a "trusted environment", a "village with a palisade", a "walled garden".

Snooping capability is only required if even that environment becomes untrusted, which might be true for some business use cases, but not necessarily for a majority of the social contexts in question (that I have in mind).

All said, it's noteworthy that the subject we'd likely target via a spec is the account identity. Binding account identity to real identity is likely out of scope.

But even that might be eventually required (for accountability, KYC). Once we enter this realm, we'd be talking about attested DiD (decentralized identity).

izN8nu6RyeneG5XnBoBgyRMVGH6H43WF commented 1 year ago

blaggacao wrote: But even that might be eventually required (for accountability, KYC). Once we enter this realm, we'd be talking about attested DiD (decentralized identity).

Speaking of which, the W3C recently approved DIDs as a Recommendation, with certain requirements for their semantics. However, these could even be used with federated identities in Matrix today.

If we are considering using W3C DIDs for our identifier semantics, for future-proofing purposes, maybe we should consider how this issue's concept of "Controlled Accounts" fits into DID semantics, esp. the spec's "DID Controller"? I'm interested in DID creation, which corresponds to a "counterparty" authorizing a Controlled Account.

Maybe this should be part of the Matrix spec after all, since, if we used W3C DID semantics, it would be part of the Matrix DID method, which is part of the protocol?
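For illustration, a DID document in the shape of W3C DID Core can express exactly this relationship via its `controller` property. The `did:matrix` method below is hypothetical (no such method is registered), and the identifiers are placeholders:

```python
# Minimal DID document shaped per W3C DID Core, showing how a "Controlled
# Account" could map onto the spec's `controller` property. The did:matrix
# method is hypothetical; identifiers are placeholders.
controlled_account_did = {
    "@context": "https://www.w3.org/ns/did/v1",
    # the controlled (child/employee) identity
    "id": "did:matrix:kid.example.org",
    # the authorizing counterparty (parent / corporate IT)
    "controller": "did:matrix:parent.example.org",
}

def controllers(did_doc):
    """DID Core allows `controller` to be a string or a list; normalise
    to a list so callers can treat both shapes uniformly."""
    value = did_doc.get("controller", [])
    return [value] if isinstance(value, str) else value

print(controllers(controlled_account_did))
# ['did:matrix:parent.example.org']
```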

joepie91 commented 1 year ago

In my opinion this is still an important feature for <13 year olds since you, as a parent, guardian or teacher still want to ensure the safety of those users in a positive way. (Think abusive/predatory adults or oppressive children)

Please consider that the vast majority of child abuse originates from "trusted" familial circles, not random strangers or other kids. "Controlled" accounts would provide more power over the child to precisely those people most likely to abuse them, even if you ignore all of the dangers that this poses to marginalized folks.

Also, keep in mind that snooping is only one of the abuse vectors. Controlling who a child can talk to very much is another (and in fact is pretty much a staple of child abuse).

I would therefore be vehemently opposed to adding a feature for "parental control" like this to the protocol (or server implementations, for that matter) in any form - it is far more likely to harm children than "keep them safe".