network-quality / draft-ietf-ippm-responsiveness


HTTP/2 Request and response prioritization #48

Open LPardue opened 2 years ago

LPardue commented 2 years ago

This is most relevant when probe requests are made on existing connections. It's touched upon in the text:

At the HTTP/2 layer it is important that the load-generating data is not interfering with the latency-measuring probes. For example, the different streams should not be stacked one after the other but rather be allowed to be multiplexed for optimal latency.

But I think you probably need to say more about this as other HTTP/2 implementations come into the mix.
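To make the quoted text concrete, here is a minimal sketch of what "allowed to be multiplexed" can look like from the client side: probes issued on the same connection that carries the bulk transfer rather than queued behind it. This is only an illustration; it assumes Go's net/http client (which multiplexes requests to the same HTTPS host over a single HTTP/2 connection) and uses placeholder URLs.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
	"time"
)

func main() {
	// One shared client, so probe streams and the load-generating stream
	// share a single HTTP/2 connection instead of being serialized.
	client := &http.Client{}

	// Load-generating download (placeholder URL).
	go func() {
		resp, err := client.Get("https://example.net/large")
		if err != nil {
			return
		}
		defer resp.Body.Close()
		io.Copy(io.Discard, resp.Body)
	}()

	// Give the bulk transfer a moment to establish the shared connection;
	// a real tool would coordinate this explicitly.
	time.Sleep(500 * time.Millisecond)

	// Latency-measuring probes, multiplexed with the download.
	for i := 0; i < 5; i++ {
		start := time.Now()
		resp, err := client.Get("https://example.net/small")
		if err != nil {
			continue
		}
		io.Copy(io.Discard, resp.Body)
		resp.Body.Close()
		fmt.Printf("probe %d: %v\n", i, time.Since(start))
		time.Sleep(100 * time.Millisecond)
	}
}
```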

cpaasch commented 2 years ago

@LPardue - what comes to your mind here? Do you have some suggestions?

LPardue commented 2 years ago

So I think it would help to clarify what is meant by "different streams stacking one after the other". Are you talking about the requests, the responses, or both?

For instance, if the client is doing a large upload and sending probe requests, it will want to prioritize sending the probe requests as soon as possible, in order to fit as many probes as possible within the test time period. The server doesn't have any real competition for sending responses to those probe requests.

In the reverse direction, if a client has requested a large download, it can easily emit probe requests (no contention), but the server is going to want to respond to probes ASAP so that it looks good.

Some servers follow client-provided signals of response prioritization. So you could recommend that probe requests be sent with high priority (using whatever scheme implementations choose).
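On that last point, one concrete (and purely illustrative) option is the Extensible Priorities scheme from RFC 9218: the client marks its probe requests with a Priority header at urgency 0, and a server that honors client signals can schedule those responses ahead of bulk data. A sketch in Go, with a placeholder URL; nothing obliges the server to act on the signal:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{}
	// Placeholder probe URL; a real measurement would reuse the connection
	// carrying the load-generating traffic.
	rtt, err := probe(client, "https://example.net/small")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("probe RTT:", rtt)
}

// probe sends one latency probe marked with the RFC 9218 Priority header.
// Urgency 0 is the most urgent level; the scheme's default is 3. Whether
// the server honors the signal is entirely up to the server.
func probe(client *http.Client, url string) (time.Duration, error) {
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		return 0, err
	}
	req.Header.Set("Priority", "u=0")

	start := time.Now()
	resp, err := client.Do(req)
	if err != nil {
		return 0, err
	}
	defer resp.Body.Close()
	io.Copy(io.Discard, resp.Body)
	return time.Since(start), nil
}
```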

Scheduling of requests on the client side is not really standardized. Different browsers do different things, which can be based on content type or request context. Tuning that can be tricky or impossible. Somebody trying to send probes and large uploads might be surprised by how those things contend. So we should probably just highlight in the considerations that this can happen, and what problems to look out for (skewed results or whatever).
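To illustrate the kind of surprise meant here, a rough sketch (Go, placeholder URLs and sizes) that samples probe latency once on an idle connection and once while a large upload is in flight on the same client; the gap between the two numbers is the kind of skew a considerations section would warn about:

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
	"time"
)

// timeProbe measures how long one small probe request takes end to end.
func timeProbe(client *http.Client, url string) time.Duration {
	start := time.Now()
	resp, err := client.Get(url)
	if err != nil {
		return 0
	}
	io.Copy(io.Discard, resp.Body)
	resp.Body.Close()
	return time.Since(start)
}

func main() {
	client := &http.Client{}
	probeURL := "https://example.net/small"   // placeholder
	uploadURL := "https://example.net/upload" // placeholder

	// Baseline: probe on an otherwise idle connection.
	idle := timeProbe(client, probeURL)

	// Start a large upload; its DATA frames compete with the probe request
	// for the connection's send capacity, and how the client schedules the
	// two is implementation-specific.
	go func() {
		body := bytes.NewReader(make([]byte, 64<<20)) // 64 MiB placeholder
		resp, err := client.Post(uploadURL, "application/octet-stream", body)
		if err != nil {
			return
		}
		resp.Body.Close()
	}()
	time.Sleep(500 * time.Millisecond) // let the upload ramp up

	loaded := timeProbe(client, probeURL)
	fmt.Printf("probe RTT idle: %v, during upload: %v\n", idle, loaded)
}
```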