Open Brandr0id opened 4 days ago
I don't understand what you mean by "what capabilities the client (browser) requires for a private auction to occur". Can you say more about what party uses the header and how?
Sure! Effectively this header gets sent when a fetch or iframe request is made to the untrusted seller server sitting in front of the SFE service. With this data a seller can determine whether the request needs to go to, say, a Protected Audience or Ad Selection (or potentially other) SFE instance. Additionally, if the browser's encrypted payload contains data only suitable for a certain version of B&A (e.g. version 3.11.0.0, which in our case is the first version), a seller can know that if they haven't deployed at least that version the request cannot be handled properly. Presumably there will be grace periods and deprecation of API fields etc. as this all evolves, but as fields are removed or breaking changes occur, this lets the client be explicit to the server side about the minimum version the auction requires to run successfully.
Ah I see. But then why would you want this to be an HTTP header? This should only be sent with very specific requests, but it seems to me like the browser wouldn't even know which requests these are.
Ahh yeah, sorry, to clarify: we'd ideally want to scope this to where the current B&A APIs interact with the Fetch JS API and the iframe element via the `adAuctionHeaders` request option and element attribute. Currently, to handle a server auction in runAdAuction, the browser must observe the response and reference the request it saw/sent. This is gated by the `adAuctionHeaders` attribute, so we have a good gate for when the browser is issuing an HTTP request that is tied to a remote server auction.
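For concreteness, a minimal sketch of those two surfaces as they exist today (the URL and payload are placeholders; the proposed platform header is not set by this code, the browser would attach it automatically to requests opted in this way):

```js
// fetch() path: the adAuctionHeaders request option opts this request
// into the server-auction header handling.
const payload = new Uint8Array(); // placeholder: contextual data + encrypted B&A request
await fetch('https://seller.example/auction', {
  method: 'POST',
  adAuctionHeaders: true,
  body: payload,
});

// iframe path: the element-level opt-in attribute.
const frame = document.createElement('iframe');
frame.src = 'https://seller.example/auction';
frame.setAttribute('adauctionheaders', '');
document.body.appendChild(frame);
```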
Aha! Okay, that makes more sense. You're asking for a new request header on `fetch('https://www.example-ssp.com/auction', { adAuctionHeaders: true, … })` requests.
Heh, but now I have two more questions:

1) Why not just have the platform support string be another property of the `AdAuctionData` object that you must be getting from `navigator.getInterestGroupAdAuctionData()` in the first place? That object contains `requestId` and `request` fields already, so including request metadata of some sort seems natural.

2) Why is your proposed header `Sec-`-prefixed? That means it can't be set by JS code, only by the browser. Is there some risk associated with malicious JS forging this header?
I'm asking these because if your field could be set via JS (Q2), then an ad tech's code could take the string from the `AdAuctionData` object (Q1) and stick it into the header you're asking for. But that same code could also make on-device decisions based on the metadata. I think your design is less flexible in that only the server has the ability to make behavior changes.
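A rough sketch of the alternative being described here, assuming a hypothetical `platformSupport` field on `AdAuctionData` and an ad-tech-chosen (non-`Sec-`) header name; both names are made up for illustration:

```js
// Hypothetical alternative: the platform string is exposed on AdAuctionData
// and the ad tech's own code forwards it (or acts on it) itself.
const auctionData = await navigator.getInterestGroupAdAuctionData({
  seller: 'https://seller.example',
});

const platform = auctionData.platformSupport; // made-up field name

// Option A: forward it to the untrusted seller server in a normal,
// ad-tech-chosen header.
await fetch('https://seller.example/auction', {
  method: 'POST',
  adAuctionHeaders: true,
  headers: { 'X-Ad-Auction-Platform': platform }, // made-up header name
  body: auctionData.request,
});

// Option B: make an on-device decision from the same metadata instead,
// e.g. fall back to a contextual-only path if the backend can't handle it.
```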
The data could potentially be exposed as metadata on `AdAuctionData`, but unlike the encrypted `request` field and the `requestId`, it is typically stable, long-lived platform information for the duration of a browsing session (vs per-request data). As for `Sec-`: this header is platform information that a caller really shouldn't need/want to set (think Client Hints, UA, etc.), and the prefix lets the receiving end have some confidence in the format and values, along with letting us GREASE the header value.
I think the typical use case, if the data were only on the `AdAuctionData` object, would be a seller requesting the `AdAuctionData` and then immediately having to add the header themselves to the request. Ideally we make this as ergonomic as possible and just have the header auto-added, so the receiving end knows what to do and where to send the request.
That being said, if we think there's additional benefit in some of these scenarios to also making a client-side determination, it seems reasonable to expose this via `AdAuctionData` in addition to the platform-added header.
Hmm sure, if it is something the server needs all the time, your position makes sense.
Can you say a little more about what kinds of behavior changes you think this will trigger server-side, though?
It seems to me like it cannot be used to pick between different trusted servers to send the blob to, because those different servers would be holding onto different cryptographic keys, and the relevant keys needed to be specified when you called `getInterestGroupAdAuctionData()` in the first place.
I also feel like it wouldn't be used to affect the behavior inside the trusted server, because a setting for that ought to be inside the encrypted blob, not in plaintext.
That leaves changes in the behavior of the untrusted server in the flow, but I don't have any examples of such a change to help me think about what this means in a practical sense.
From the Chrome POV, I don't know what value we would set this string to, or when we would ever change it. Maybe some samples of how you're thinking of using it in Edge would help?
We have two main scenarios in mind with this:
1) The different trusted servers scenario. Common client code in, say, Chrome, Edge, or another browser supporting this flow could make a common fetch(...) call to the untrusted seller server with common real-time and contextual bidding logic. The seller can then use this header to determine what action to take, such as sending the encrypted blob to a compatible cluster (in this case either Protected Audience or Ad Selection API), rather than sending the encrypted blob and seeing which instance can decrypt or handle the request.
2) Breaking API changes. Currently there is an implicit API for the encrypted blob: the browser packages a set of data, encrypts it, and the SFE plus other services rely on that info to complete the auction. There have already been a few fields the browser sends to the server auction that have since been deprecated. While ideally there would be grace periods where browser versions send both the old and new fields before eventually dropping the deprecated one, at some point breaking changes occur, such as deleting a required field. Sufficiently old server instances will then no longer receive the old field and will fail to process the auction. This header gives us a clear indication of the minimum version of B&A required to successfully run an auction based on the payload sent by the browser.
For example, if a seller is running v3.11.0.0 of B&A, and in say Chromium m140 we stop sending an important field because a replacement was added in m133 and the B&A services started supporting the new field in v4.4.0.0, then in m140 (or even m139, as an escape hatch for the change) the min-version the browser sends would be incremented to v4.4.0.0 to indicate that requests coming from these browsers must be sent to at least that backend version. A seller who has not deployed at least that version can choose to fall back to other means, or even fire telemetry to indicate that a previously delayed update should be prioritized based on incoming traffic (e.g. ramping up as the browser deploys through the Canary → Stable audiences).
I think for Chrome, right now the value would likely be set to the lowest B&A version that is both currently supported and compatible with the encrypted data schema that is sent today.
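To make both scenarios concrete, here is a rough sketch of what the untrusted seller server in front of the SFE might do with such a header. The header name (`Sec-Ad-Auction-Platform`), its value syntax, and the backend endpoints are all illustrative placeholders, not the actual proposal:

```js
// Hypothetical routing logic on the untrusted seller server. Assumes a
// header shaped like: Sec-Ad-Auction-Platform: "ad-selection";v="4.4.0.0"
function routeAuctionRequest(headers) {
  const value = headers.get('Sec-Ad-Auction-Platform');
  if (!value) return { backend: null, reason: 'no platform header' };

  const platform = /"([^"]+)"/.exec(value)?.[1];     // e.g. "ad-selection"
  const minVersion = /v="([^"]+)"/.exec(value)?.[1]; // e.g. "4.4.0.0"

  // B&A versions this seller actually has deployed (placeholders).
  const deployed = {
    'protected-audience': '4.6.0.0',
    'ad-selection': '3.11.0.0',
  };

  if (!(platform in deployed)) {
    // Scenario 1: no compatible cluster for this platform at all.
    return { backend: null, reason: `no ${platform} cluster deployed` };
  }
  if (compareVersions(deployed[platform], minVersion) < 0) {
    // Scenario 2: deployment is too old for this browser's payload;
    // fall back (contextual-only auction, fire telemetry, etc.).
    return { backend: null, reason: 'deployed B&A version too old' };
  }
  // Forward the encrypted blob to the compatible SFE cluster.
  return { backend: `https://sfe-${platform}.internal.example`, reason: 'ok' };
}

function compareVersions(a, b) {
  const pa = a.split('.').map(Number), pb = b.split('.').map(Number);
  for (let i = 0; i < Math.max(pa.length, pb.length); i++) {
    const d = (pa[i] || 0) - (pb[i] || 0);
    if (d !== 0) return d;
  }
  return 0;
}
```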
Heh: Your scenarios are the two that I said would not benefit from this feature, in my previous comment, so I clearly have some learning to do 🫤.
If you don't mind, let's focus on your scenario 1, two different trusted servers.
The way we've designed the B&A flow is that at the time you invoke the `getInterestGroupAdAuctionData()` API to get the auction blob, you need to specify the `coordinatorOrigin` for your server-side auction. That's because the API call is telling the browser which coordinator keys it should use to encrypt the blob, so that only the right TEE can decrypt it.
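A minimal sketch of that flow, with placeholder origins (the seller and coordinator URLs are illustrative):

```js
// The caller decides up front which server-side auction it plans to run;
// that decision shows up both in the coordinator choice and in the fetch.
const { requestId, request } = await navigator.getInterestGroupAdAuctionData({
  seller: 'https://seller.example',
  coordinatorOrigin: 'https://coordinator.example', // selects the key set / TEE
});

// Only servers holding that coordinator's keys can decrypt `request`,
// so the caller already knows which backend this blob is destined for.
await fetch('https://seller.example/auction', {
  method: 'POST',
  adAuctionHeaders: true,
  body: request,
});
// (`requestId` is later handed back to runAdAuction() along with the
// server's response.)
```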
So I definitely would not expect servers "sending the encrypted blob and seeing which instance can decrypt or handle the request". Instead I would expect the API caller deciding what server-side auction it planned to run, and then having that decision affect both the coordinator choice in the API call and the contents of the fetch itself.
The flow I'm describing works for our current situation, with two different coordinators each with their own key sets (that is, AWS keys are different than GCP keys). So what is new and different that I'm missing?
Sure, let's dive in on that scenario! That all sounds accurate and aligns with my understanding of the current world too. Potentially a developer could, on the browser, keep track of which keys/coordinator they are using and which backend deployment(s) that corresponds to, and then for the fetch() scenario add other headers or metadata in the request body to signal to the untrusted server which backend to send the request to. For iframe scenarios I think it gets a bit harder; the query string/params might have to be used to pass that information along, but it's still doable.
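A sketch of that workaround, with made-up names (the coordinator-to-backend mapping, header, and query parameter are hypothetical and maintained entirely by the ad tech):

```js
// Client-side bookkeeping: which backend deployment corresponds to the
// coordinator/keys being used.
const BACKENDS = {
  'https://coordinator-a.example': { platform: 'protected-audience', endpoint: 'https://seller.example/pa-auction' },
  'https://coordinator-b.example': { platform: 'ad-selection', endpoint: 'https://seller.example/as-auction' },
};
const coordinatorOrigin = 'https://coordinator-a.example';
const backend = BACKENDS[coordinatorOrigin];

const { request } = await navigator.getInterestGroupAdAuctionData({
  seller: 'https://seller.example',
  coordinatorOrigin,
});

// fetch() scenario: signal the backend via an ad-tech-chosen header or body field.
await fetch(backend.endpoint, {
  method: 'POST',
  adAuctionHeaders: true,
  headers: { 'X-Auction-Backend': backend.platform }, // made-up header
  body: request,
});

// iframe scenario: no custom headers, so the hint has to ride along in the
// query string instead.
const frame = document.createElement('iframe');
frame.src = `${backend.endpoint}?backend=${encodeURIComponent(backend.platform)}`;
frame.setAttribute('adauctionheaders', '');
document.body.appendChild(frame);
```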
For ad tech that choose to use the browser's default coordinator, that parameter is currently optional and does not even need to be specified. This would allow ad tech to have common code like `navigator.getInterestGroupAdAuctionData({ seller: 'https://ad-tech.example' })` that "just works" in any browser supporting the spec. However, the default coordinator will likely differ between browsers for a number of reasons. The addition of this header would both let us decouple the platform requirements/capabilities (e.g. I need a Protected Audience, Ad Selection, or other future solution capable private auction instance with min-version A.B.C.D) from the key functionality, and improve the ergonomics so developers don't have to think about it client side and can just look for the header and choose what to do.
On a sort of related note, with the direction coordinator/KMS granularity is heading (towards a single cross-cloud coordinator), it seems likely that in the future the coordinator won't need to be specified at all, as keys will likely be unified between, say, AWS, GCP, Azure, etc. for a given browser.
@michaelkleber, @JensenPaul
One of the things we've added in the Ad Selection API is a platform header to indicate to the B&A seller side what capabilities the client (browser) requires for a private auction to occur.
Example:
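Something along these lines; the header name and value syntax here are illustrative placeholders for the general shape, not the exact header:

```
POST /auction HTTP/1.1
Host: seller.example
Sec-Ad-Auction-Platform: "ad-selection";v="3.11.0.0"
```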
Can we add this to the Chromium impl as a way for any seller to understand what platform support they would need to handle a given auction? We feel this would help with routing between different potential server images, and also provide a mechanism, as the API and underlying auction data structure evolve, for signaling breaking changes or required additions that old versions may not be compatible with.
Once we align on if/what is to be added, I'm happy to move this over to a crbug and/or put up patches to allow any embedder to specify platform details.