korydraughn opened 10 months ago
You can use an http::buffer_body to send the response part by part. This example might help: https://www.boost.org/doc/libs/1_84_0/libs/beast/doc/html/beast/more_examples/send_child_process_output.html.
If you are reading from a file, you might want to consider using http::file_body, which performs all these steps automatically.
Alternatively, you can create a custom body type that reads from a stream into a fixed-size buffer and returns the corresponding buffers until it reaches the end of the stream.
Thanks. I'll give that a try.
The send child process output example was exactly what I needed.
Perhaps adding an async example that demonstrates the same thing would be helpful to others, mostly to show how to manage the lifetime of the response_serializer across async calls.
Question
I'm using Beast 1.81 with Clang 13.0.0 to present a distributed data management API over HTTP.
I've been reading a lot of the documentation and it's not clear to me what the recommended solution is for streaming large amounts of data back to the client. The pages that look closest to what I want are:
At the moment, my application presents something similar to POSIX read. That is, it forces the client to make multiple read requests to the HTTP server. This isn't ideal, but is good enough for a first pass (I'm still getting familiar with Beast).
You can view the code in question here:
The application has a background thread pool for long running operations. I want to use that to incrementally stream the data back.
Q. Does a custom body type make sense here? Or would a response_serializer be better?
Q. Is transfer-encoding: chunked automatically supported by custom body types?
Q. Are there examples which demonstrate how to do this asynchronously?
BTW, thanks for the awesome library.