chrisn opened 8 months ago
Section 4.2 talks about personal data stores. These can give users a repository of their data; they don't necessarily prevent other actors from also holding copies of that data and using it for AI training purposes.

I think that depends on how the PDS is architected. Some PDSs are basically stores with an ACL layer, and once you've granted access, the horses have fled the barn. But an alternative (and I would say better) design is one in which the code comes to the data and runs in a sandbox from which it cannot exfiltrate data. #26 hints at this, though it's not made explicit.
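To make the distinction concrete, here's a minimal sketch of the "code comes to the data" design. All names here (`PersonalDataStore`, `Record`, `run_query`) are hypothetical, not from any real PDS implementation, and the toy output check stands in for a real sandbox, which would need genuine isolation (separate process or VM, resource limits, audited outputs):

```python
"""Illustrative sketch only -- not a real sandbox."""

from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass(frozen=True)
class Record:
    kind: str
    value: float


class PersonalDataStore:
    """Holds the user's records and never hands out raw copies.

    Instead of granting read access (the ACL model), callers submit
    a reduction function that runs *here*, next to the data, and
    only a vetted scalar result is allowed back out.
    """

    def __init__(self, records: Iterable[Record]):
        self._records = list(records)

    def run_query(self, kind: str,
                  reduce_fn: Callable[[list[float]], float]) -> float:
        values = [r.value for r in self._records if r.kind == kind]
        result = reduce_fn(values)
        # Exfiltration guard: only plain scalars may leave the store.
        # A real deployment would also need process/VM isolation and
        # output auditing -- this check alone is nowhere near enough.
        if not isinstance(result, (int, float)):
            raise ValueError("only scalar aggregates may leave the store")
        return float(result)


if __name__ == "__main__":
    pds = PersonalDataStore([
        Record("heart_rate", 62.0),
        Record("heart_rate", 71.0),
        Record("steps", 8042.0),
    ])
    # The caller's code runs inside the store; raw records never leave.
    avg = pds.run_query("heart_rate", lambda vs: sum(vs) / len(vs))
    print(f"average heart rate: {avg:.1f}")
```

The point is the inversion of data flow: under the ACL model the raw records cross the trust boundary and can then be copied or used for training at will; here only the vetted aggregate does.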