LPardue opened 2 years ago
@LPardue - what comes to mind here? Do you have any suggestions?
So I think it would help to clarify what is meant by "different streams stacking one after the other". Are you talking about the requests, the responses, or both?
For instance, if the client is doing a large upload and sending probe requests, it will want to prioritize sending the probe requests as soon as possible, in order to fit as many probes as possible within the test period. The server doesn't have any real competition for sending responses to those probe requests.
In the reverse direction, if a client has requested a large download, it can easily emit probe requests (no contention on the client side), but the server will want to respond to probes as soon as possible so that its measured responsiveness looks good.
Some servers honor client-provided response prioritization signals. So you could recommend that probe requests be sent with high priority (using whatever prioritization scheme implementations support).
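As a concrete illustration of "whatever scheme implementations choose", here is a minimal sketch of marking a probe request as high urgency using the Extensible Priorities scheme (RFC 9218). The `probe_headers` helper name is hypothetical; the `Priority` header field and its `u` (urgency) parameter are from the RFC, where `u=0` is most urgent and servers are free to ignore the signal entirely.

```python
def probe_headers(urgency: int = 0) -> dict:
    """Build headers for a probe request.

    Uses the RFC 9218 Priority header; urgency ranges 0 (most urgent)
    to 7 (least urgent), with 3 as the spec's default.
    """
    if not 0 <= urgency <= 7:
        raise ValueError("urgency must be in 0..7")
    return {"Priority": f"u={urgency}"}

# A probe marked as maximally urgent:
print(probe_headers())  # {'Priority': 'u=0'}
```

This only expresses a preference to the server; whether responses actually jump the queue depends on the server's scheduler.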
Scheduling of requests on the client side is not really standardized. Different browsers do different things, which can be based on content type or request context; tuning that can be tricky or impossible. Somebody trying to send probes alongside large uploads might be surprised by how those things contend. So we should probably just highlight in the considerations section that this can happen, and what problems to look out for (skewed results, etc.).
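To make the contention concern above concrete, here is a toy back-of-the-envelope model (all names and numbers are illustrative, not from any implementation): if a probe is queued FIFO behind a large upload backlog on a shared connection, its send delay includes draining the whole backlog, which inflates the measured round-trip; a priority scheduler that lets the probe jump the queue avoids that skew.

```python
# Illustrative assumptions, not measured values:
LINK_RATE = 1_000_000     # bytes/sec the connection can send
UPLOAD_QUEUED = 500_000   # upload bytes already buffered ahead of the probe
PROBE_SIZE = 100          # bytes in the probe request

# FIFO: the probe waits behind the entire upload backlog.
fifo_delay = (UPLOAD_QUEUED + PROBE_SIZE) / LINK_RATE

# Priority scheduling: only the probe's own bytes matter.
priority_delay = PROBE_SIZE / LINK_RATE

print(f"FIFO probe send delay:     {fifo_delay * 1000:.1f} ms")
print(f"Priority probe send delay: {priority_delay * 1000:.1f} ms")
```

Under these numbers the FIFO case adds roughly half a second of queueing delay to every probe, which would dominate the responsiveness result.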
This is most relevant when probe requests are made on existing connections. It's touched upon in the text, but I think you probably need to say more about this as other HTTP/2 implementations come into the mix.