jeffutter opened this issue 5 days ago
Hello @jeffutter
> streaming HTTP response bodies? I see `ClientStream.recv_res`
For servers, `wtx` doesn't support HTTP/2 Server Push because Google removed it from Chrome (https://developer.chrome.com/blog/removing-push?hl=fr), as such, it is not possible to send arbitrary numbers of responses to clients.
> // https://c410-f3r.github.io/wtx/http2/index.html
> Passes the hpack-test-case and the h2spec test suites. Due to official deprecation, server push and prioritization are not supported.
For clients, `wtx` only supports sending requests to servers, and I am not aware of any other project that allows clients to send responses.
> create a request and stream the response body.
Contrary to HTTP/1, an HTTP/2 connection can live indefinitely, so when a stream is opened, both parties can theoretically transfer as much data as needed. One caveat is that this interaction is a "one-shot", semiduplex-like (https://en.wikipedia.org/wiki/Duplex_(telecommunications)) scenario.
I have a feeling that you are looking for full-duplex communication between the client and the server. If that is the case, then WebSockets over HTTP/2 streams should be enough (https://datatracker.ietf.org/doc/html/rfc8441).
Unfortunately, `wtx` doesn't support such a feature at the current time, and I am not sure when an implementation will be available.
Another thing worth mentioning is that `wtx` is still experimental and shouldn't be used in production environments.
Cheers
Thanks for the quick reply. I should have provided some examples of what I'm asking about for clarity as some of the terms (like "stream") are pretty overloaded.
I'm looking for unidirectional communication where the client can read bytes of the response before the full response has been delivered.
A common example might be 'streaming' a video file. You have one request to the server and one response, but you don't have to wait until the entire file is downloaded before you can do anything with it.
`ClientStream.recv_res` returns a `ReqResBuffer`, which I think has the entire body buffered up into `data: Vector<u8>`, and it then closes the http/2 stream.
What I'm looking for is more like `hyper`'s `SendRequest.send_request`, which returns a `Result<Response<IncomingBody>>` on which you can call `poll_frame()` (or `frame()` from `http-body-util`'s `BodyExt`). This gets you a chunk of the body that you can use before the entire request is finished.
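To make that consumption pattern concrete, here is a std-only Rust sketch (no `hyper` or `wtx` types; the channel and the `consume` function are stand-ins for a streaming body): each chunk is usable before the producer has finished writing the whole body.

```rust
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for a streaming body: each `Vec<u8>` plays the
// role of one DATA frame arriving from the network.
fn consume(rx: mpsc::Receiver<Vec<u8>>) -> Vec<u8> {
    let mut body = Vec::new();
    for chunk in rx {
        // Each chunk can be processed here before the full body exists.
        body.extend_from_slice(&chunk);
    }
    body
}

fn main() {
    let (tx, rx) = mpsc::channel::<Vec<u8>>();

    // Producer thread: emits chunks one at a time, like a server that is
    // still writing the response.
    let producer = thread::spawn(move || {
        for chunk in vec![b"part1".to_vec(), b"part2".to_vec()] {
            tx.send(chunk).unwrap();
        }
        // Dropping `tx` ends the stream, analogous to an END_STREAM flag.
    });

    let body = consume(rx);
    producer.join().unwrap();
    assert_eq!(body, b"part1part2".to_vec());
    println!("received {} bytes", body.len());
}
```

The point is only the shape of the API: the consumer loop runs concurrently with the producer instead of waiting for a fully buffered body.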
FWIW, my actual use case is to implement a client for Apollo GraphQL's multipart-HTTP GraphQL subscriptions, since they don't support subscriptions over WebSockets.
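For context, that protocol delivers parts of a `multipart/mixed` response incrementally. A toy, std-only splitter might look like the following (the boundary name and framing are simplified assumptions, not Apollo's exact wire format; a real client must read the boundary from the `Content-Type` header and parse part headers):

```rust
// Incremental splitter for a multipart-style byte stream: feed bytes as
// they arrive, and completed parts come back as soon as the next
// boundary shows up, without waiting for the response to end.
struct MultipartSplitter {
    boundary: Vec<u8>,
    buf: Vec<u8>,
}

impl MultipartSplitter {
    fn new(boundary: &str) -> Self {
        Self { boundary: format!("--{boundary}").into_bytes(), buf: Vec::new() }
    }

    // Push newly received bytes, get back any parts completed so far.
    fn push(&mut self, bytes: &[u8]) -> Vec<Vec<u8>> {
        self.buf.extend_from_slice(bytes);
        let mut parts = Vec::new();
        while let Some(pos) = find(&self.buf, &self.boundary) {
            let part: Vec<u8> = self.buf.drain(..pos).collect();
            self.buf.drain(..self.boundary.len());
            if !part.is_empty() {
                parts.push(part);
            }
        }
        parts
    }
}

// Naive subslice search (std has no `find` for byte slices).
fn find(haystack: &[u8], needle: &[u8]) -> Option<usize> {
    haystack.windows(needle.len()).position(|w| w == needle)
}

fn main() {
    let mut s = MultipartSplitter::new("graphql");
    // First network read: the part is not terminated yet, nothing emitted.
    let first = s.push(b"--graphql\r\n{\"payload\":1}\r\n");
    assert!(first.is_empty());
    // Second read: the next boundary completes part 1, and the closing
    // delimiter completes part 2.
    let second = s.push(b"--graphql\r\n{\"payload\":2}\r\n--graphql--");
    assert_eq!(second.len(), 2);
    println!("{} parts", second.len());
}
```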
Hope this clears up the ask.
Thanks!
Thank you for the clarification.
> `ClientStream.recv_res` returns a `ReqResBuffer` which I think has the entire body buffered up into `data: Vector<u8>`, and it then closes the http/2 stream.
Yeah, that is correct. All data must be available in advance.
The underlying machinery automatically splits the data according to https://datatracker.ietf.org/doc/html/rfc9113#SETTINGS_MAX_FRAME_SIZE and sends each block concurrently respecting flow control parameters. This approach is pragmatic but lacks flexibility, as you noticed.
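As a rough illustration of that splitting step (the function name is invented for this sketch, not wtx's actual API), assuming the RFC 9113 default maximum frame size of 16,384 octets:

```rust
// Split a fully buffered body into DATA-frame-sized blocks.
// `max_frame_size` corresponds to the peer's SETTINGS_MAX_FRAME_SIZE
// (between 2^14 and 2^24 - 1 octets per RFC 9113).
fn split_into_frames(body: &[u8], max_frame_size: usize) -> Vec<&[u8]> {
    body.chunks(max_frame_size).collect()
}

fn main() {
    let body = vec![0u8; 40_000];
    // 16_384 is the default SETTINGS_MAX_FRAME_SIZE.
    let frames = split_into_frames(&body, 16_384);
    assert_eq!(frames.len(), 3); // 16_384 + 16_384 + 7_232
    assert_eq!(frames[2].len(), 40_000 - 2 * 16_384);
    println!("{} frames", frames.len());
}
```

Flow control then decides when each block may actually be written; the split itself is the mechanical part.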
> What I'm looking for is more like `hyper`'s `SendRequest.send_request` which returns a `Result<Response<IncomingBody>>` on which you can call `poll_frame()` (or `frame()` from `http-body-util` `BodyExt`). This gets you a chunk of the body that you can use before the entire request is finished.
Oh, I see. Fine-grained or "low-level" stream operations weren’t added because no one asked until now :)
https://github.com/c410-f3r/wtx/issues/243
I plan to add this new functionality in the next version of `wtx`, which will probably happen in approximately 20 days. Nevertheless, you should preferably use `hyper` or `h2` to implement the client for Apollo GraphQL's subscriptions.
Hi there,
Very interested in what you are doing here with `wtx`. I was wondering if there is support for streaming HTTP response bodies? I see `ClientStream.recv_res`, but what it says is not what I want. It does indicate "Higher operation", but I can't really seem to find any lower-level client APIs to create a request and stream the response body.
I'm wondering if I'm overlooking something here. Or, if it doesn't exist, are there plans to add such APIs?