I'm using Beast 1.81 with Clang 13.0.0 to present a distributed data management API over HTTP.
I've read a lot of the documentation, and it's not clear to me what the recommended approach is for streaming large amounts of data back to the client. The pages that seem closest to what I want are:
At the moment, my application presents something similar to POSIX read; that is, it forces the client to make multiple read requests to the HTTP server. This isn't ideal, but it's good enough for a first pass (I'm still getting familiar with Beast).
You can view the code in question here:
The application has a background thread pool for long running operations. I want to use that to incrementally stream the data back.
Q. Does a custom body type make sense here? Or would a response_serializer be better?
Q. Is transfer-encoding: chunked automatically supported by custom body types?
Q. Are there examples which demonstrate how to do this asynchronously?
BTW, thanks for the awesome library.
If you are reading from a file, you might want to consider using http::file_body, which handles the incremental reading and sending automatically.
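For reference, a minimal sketch of serving a file that way, assuming the usual aliases (`namespace beast = boost::beast; namespace http = beast::http;`), an already-connected `stream`, and a parsed request `req`; the path is a placeholder:

```cpp
// Minimal sketch: stream a large file with http::file_body.
http::file_body::value_type body;
beast::error_code ec;
body.open("/path/to/large.bin", beast::file_mode::scan, ec); // placeholder path
if(ec)
    return; // handle the error

http::response<http::file_body> res{http::status::ok, req.version()};
res.set(http::field::content_type, "application/octet-stream");
res.body() = std::move(body);
res.prepare_payload(); // sets Content-Length from the file size
http::write(stream, res, ec); // Beast reads and sends the file incrementally
```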
Alternatively, you can create a custom body type that reads from a stream into a fixed-size buffer and returns the corresponding buffers until it reaches the end of the stream.
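A hedged sketch of that shape, following Beast's BodyWriter requirements; `data_source` and its `read`/`eof` members are hypothetical stand-ins for whatever produces the data:

```cpp
// Sketch of a custom body whose writer drains an upstream source into a
// fixed-size buffer. Only the Body/writer shape comes from Beast's
// BodyWriter concept; data_source is a hypothetical producer.
struct streaming_body
{
    struct value_type
    {
        data_source* src; // hypothetical: read(char*, size_t, error_code&), eof()
    };

    class writer
    {
        value_type& body_;
        char buf_[8192]; // fixed-size staging buffer

    public:
        using const_buffers_type = net::const_buffer;

        template<bool isRequest, class Fields>
        writer(http::header<isRequest, Fields> const&, value_type& b)
            : body_(b)
        {
        }

        void init(beast::error_code& ec)
        {
            ec = {};
        }

        boost::optional<std::pair<const_buffers_type, bool>>
        get(beast::error_code& ec)
        {
            auto const n = body_.src->read(buf_, sizeof(buf_), ec);
            if(ec || n == 0)
                return boost::none; // error or end of stream
            // Second element true => more buffers follow after this one.
            return {{net::const_buffer(buf_, n), !body_.src->eof()}};
        }
    };
};
```

On the chunked question: the transfer encoding is applied by the serializer based on the message headers, not by the body type, so setting res.chunked(true) gets a custom body's buffers chunked automatically.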
The send child process output example was exactly what I needed.
Perhaps adding an async example that demonstrates the same thing would be helpful to others, mostly to show how to manage the lifetime of the response_serializer across async calls.
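Until something like that exists, here is a rough sketch of one way to structure it, with all names invented for illustration: the session owns the response and serializer so both outlive each async call, and http::buffer_body lets a background pool feed in blocks:

```cpp
// Rough sketch: the session owns the response and its serializer so both
// survive across async_write calls. With http::buffer_body, async_write
// completes with http::error::need_buffer when it wants the next block.
#include <boost/beast/core.hpp>
#include <boost/beast/http.hpp>
#include <memory>

namespace beast = boost::beast;
namespace http = beast::http;
namespace net = boost::asio;

class stream_session
    : public std::enable_shared_from_this<stream_session>
{
    beast::tcp_stream stream_;
    http::response<http::buffer_body> res_;
    http::response_serializer<http::buffer_body> sr_{res_};

public:
    explicit stream_session(net::ip::tcp::socket socket)
        : stream_(std::move(socket))
    {
        res_.result(http::status::ok);
        res_.chunked(true); // no Content-Length; use chunked encoding
        res_.body().data = nullptr;
        res_.body().size = 0;
        res_.body().more = true; // more body data will be supplied later
    }

    void run()
    {
        http::async_write_header(stream_, sr_,
            [self = shared_from_this()](beast::error_code ec, std::size_t)
            {
                if(!ec)
                    self->produce_next(); // hand off to the thread pool
            });
    }

    // Called (via net::post to the stream's executor) each time the
    // background pool has filled buf with n bytes.
    void on_block_ready(char* buf, std::size_t n, bool last)
    {
        res_.body().data = buf;
        res_.body().size = n;
        res_.body().more = !last;
        http::async_write(stream_, sr_,
            [self = shared_from_this()](beast::error_code ec, std::size_t)
            {
                if(ec == http::error::need_buffer)
                {
                    // Expected: this block was consumed; produce another.
                    self->produce_next();
                    return;
                }
                // Success here means the response is complete.
            });
    }

private:
    void produce_next(); // submit the next read to the thread pool (not shown)
};
```

The key point is that res_ and sr_ are members, and each completion handler captures shared_from_this(), so the serializer stays alive until the last handler runs.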