winstrom opened 1 year ago
Hi @winstrom - just to confirm, the "we" here is Apple?
Also, great to have these docs! Could we add them in via a PR?
Hi @AramZS -- yes "we" is Apple WebKit. Sorry for not being clear.
Happy to add via a PR or to add them to a repo in https://github.com/patcg-individual-drafts (see https://github.com/patcg/proposals/issues/16)
We try to refer to ourselves as Apple WebKit in the web standards context. My preference would be to use that.
@winstrom Ah, I see that request. I have added you as a member of patcg-individual-drafts; you should have access there to set up a repo now. Let me know if that doesn't work.
Thanks @AramZS -- files have been added to https://github.com/patcg-individual-drafts/private-ad-measurement
We have a proposal to allow measuring attribution of advertisements with privacy guarantees.
We build on previous privacy proposals such as Private Click Measurement (PCM), Interoperable Private Attribution (IPA), and the Attribution Reporting API with Aggregatable Reports (ARA). Our goal at each stage is to transmit only the minimum information necessary to perform the attribution measurement and nothing else.
Like PCM, we rely on the user's device to join an advertisement impression and conversion together. This means the browser is trusted with event-level information about user interactions and joins them into summaries of attribution represented as histograms. These histograms contain only the attribution value of a conversion, not a browsing history.
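The on-device join described above might be sketched as follows. This is a minimal illustration, not the proposal's actual matching logic; the bucket layout, field names, and matching key are all hypothetical.

```python
# Hypothetical sketch: the browser matches a stored impression with a later
# conversion and emits a dense histogram whose single non-zero bucket holds
# the attribution value. No browsing history leaves the device.

HISTOGRAM_SIZE = 16  # illustrative number of buckets


def join_attribution(impressions, conversion):
    """Return a dense histogram contribution, or None if nothing matches."""
    histogram = [0] * HISTOGRAM_SIZE
    for imp in impressions:
        if imp["ad_id"] == conversion["ad_id"]:
            # Only the conversion's value lands in the agreed bucket.
            histogram[imp["bucket"]] = conversion["value"]
            return histogram
    return None


impressions = [{"ad_id": "campaign-42", "bucket": 3}]
conversion = {"ad_id": "campaign-42", "value": 5}
print(join_attribution(impressions, conversion))
```

An unmatched conversion produces no contribution at all, which keeps the report from revealing whether a given impression was ever seen.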
Like IPA, we rely on multi-party computation (MPC) to cryptographically split data across multiple computation partners so that no individual organization can track an individual. This system is used both for aggregation and to introduce differentially private noise, ensuring a well-defined privacy-loss bound for each user of the system. We rely on the Prio framework (Stanford Applied Crypto Group) to perform this multi-party aggregation, and on differential privacy (see The Algorithmic Foundations of Differential Privacy) to add appropriate noise to attribution calculations to make them private.
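The two ideas above can be illustrated with a toy sketch: additive secret sharing splits each histogram between two aggregation servers so that neither sees an individual contribution, and Laplace noise calibrated to sensitivity/epsilon gives a differential-privacy guarantee on the aggregate. This is not Prio itself (which additionally provides zero-knowledge validity proofs); the field size and noise parameters here are illustrative assumptions.

```python
# Toy sketch of additive secret sharing plus differentially private noise.
# Not the proposal's actual protocol; Prio adds validity proofs on top.
import math
import random

PRIME = 2**31 - 1  # field modulus; illustrative choice


def secret_share(histogram):
    """Split a histogram into two additive shares mod PRIME."""
    share_a = [random.randrange(PRIME) for _ in histogram]
    share_b = [(v - a) % PRIME for v, a in zip(histogram, share_a)]
    return share_a, share_b


def aggregate(shares_a, shares_b):
    """Each server sums its shares locally; recombining yields the totals."""
    sum_a = [sum(col) % PRIME for col in zip(*shares_a)]
    sum_b = [sum(col) % PRIME for col in zip(*shares_b)]
    return [(a + b) % PRIME for a, b in zip(sum_a, sum_b)]


def laplace_noise(scale):
    """Sample Laplace noise via the inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def noisy_totals(totals, epsilon, sensitivity=1.0):
    """Laplace mechanism: noise scaled to sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return [t + laplace_noise(scale) for t in totals]
```

Each share on its own is a uniformly random vector, so a single server learns nothing about any user's histogram; only the sum of both servers' local totals reveals the (noised) aggregate.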
Like ARA, we wish to allow measurements across a large, sparse space defining the potential linkages between advertisers and publishers. We present a concrete way to encode this sparse space using dense histograms so that individual contributions can be aggregated using known MPC approaches.
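One minimal way to picture the sparse-to-dense encoding, assuming a simple hash-based bucketing scheme that is not necessarily the proposal's actual encoding: each (publisher, advertiser) pair maps to a bucket index in a fixed-size histogram, so every client submits a vector of the same dense shape regardless of which pair it observed.

```python
# Hypothetical sketch: map a sparse (publisher, advertiser) pair into a
# fixed-size dense histogram via hashing. Bucket count and hash choice are
# illustrative; the proposal's concrete encoding may differ.
import hashlib

NUM_BUCKETS = 1024  # dense histogram size; illustrative


def bucket_for(publisher, advertiser):
    """Deterministically map a sparse pair to a dense bucket index."""
    digest = hashlib.sha256(f"{publisher}|{advertiser}".encode()).digest()
    return int.from_bytes(digest[:8], "big") % NUM_BUCKETS


def contribution(publisher, advertiser, value):
    """Build a dense histogram with the value in the pair's bucket."""
    histogram = [0] * NUM_BUCKETS
    histogram[bucket_for(publisher, advertiser)] = value
    return histogram
```

Because every contribution has the same dense shape, the vectors can be secret-shared and summed coordinate-wise by the MPC aggregators without revealing which sparse pair any one client contributed to.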
For a readable overview, we provide an explainer.
We also provide a document with more details.