jenskueper opened this issue 1 year ago
@jenskueper I'm curious, how would you use this in your application?
I see two use cases for us. The first is for the Bref console/worker to have real-time output for long-running tasks. The other is that `kernel.terminate` in Symfony-based projects currently runs synchronously in Bref, which differs from a non-serverless setup. With response streaming we could change this behaviour and achieve faster response times for the client while still offering the benefits of offloading IO-intensive tasks to the terminate event.
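For context, a hedged sketch of the kind of work we would move to `kernel.terminate` (the `ReportGenerator` service is a hypothetical example): today Bref finishes this work before the response leaves the function, whereas with response streaming the client could receive the response first.

```php
<?php

use Symfony\Component\EventDispatcher\EventSubscriberInterface;
use Symfony\Component\HttpKernel\Event\TerminateEvent;
use Symfony\Component\HttpKernel\KernelEvents;

// Hypothetical subscriber: the IO-heavy work runs after the response has been
// produced. With Lambda response streaming the client would no longer have to
// wait for it.
final class FlushReportsSubscriber implements EventSubscriberInterface
{
    public function __construct(private ReportGenerator $reports) {}

    public static function getSubscribedEvents(): array
    {
        return [KernelEvents::TERMINATE => 'onTerminate'];
    }

    public function onTerminate(TerminateEvent $event): void
    {
        // e.g. push metrics, send emails, write reports to S3...
        $this->reports->flush();
    }
}
```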
Another use case is to create an HTTP endpoint using AWS Lambda and API Gateway to show log details from CloudWatch Logs. The endpoint won't work if there are too many logs, because the payload exceeds the Lambda response size limit.
@deminy I'm not sure I understand, why not query the CloudWatch API directly?
@mnapoli There are various log groups for different AWS resources. When an issue happens with one of those services, we'd like an easy way for people to quickly get the logs. People could come from different teams, and not everyone has direct access to CloudWatch. As a result, an HTTP endpoint serves as a convenient solution for this objective. I hope that clarifies the use case.
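To illustrate why streaming matters for that endpoint, here is a hedged sketch (region, log group name, and the bare-PHP wiring are placeholders) that pages through CloudWatch Logs with the AWS SDK for PHP and emits each page as it arrives, instead of buffering everything into a single Lambda response:

```php
<?php

require 'vendor/autoload.php';

use Aws\CloudWatchLogs\CloudWatchLogsClient;

// Sketch: emit log events page by page. Without response streaming, all pages
// have to be concatenated into one payload, which can exceed the Lambda
// response size limit.
$client = new CloudWatchLogsClient([
    'region'  => 'us-east-1',   // placeholder region
    'version' => 'latest',
]);

header('Content-Type: text/plain');

$pages = $client->getPaginator('FilterLogEvents', [
    'logGroupName' => '/aws/lambda/my-service',   // placeholder log group
]);

foreach ($pages as $page) {
    foreach ($page['events'] as $event) {
        echo date('c', (int) ($event['timestamp'] / 1000)) . ' ' . $event['message'] . "\n";
    }
    flush();   // hand each page to the client as soon as it is fetched
}
```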
Another use case I am interested in: streaming responses from secondary APIs that return their result with HTTP streaming, such as OpenAI.
This is not only "cool", but without it, you will quickly encounter response timeouts on longer result sets.
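To make that concrete, a hedged sketch (the upstream URL is a placeholder and this assumes `allow_url_fopen` is enabled) of relaying an upstream streaming HTTP response chunk by chunk with Symfony's `StreamedResponse`; it only pays off once the platform actually streams instead of buffering:

```php
<?php

use Symfony\Component\HttpFoundation\StreamedResponse;

// Sketch: relay an upstream streaming API (e.g. an SSE endpoint) to the client
// as chunks arrive. On a buffered platform the whole body is assembled first,
// and long generations can hit the response timeout.
function relayUpstreamStream(): StreamedResponse
{
    return new StreamedResponse(function (): void {
        $upstream = fopen('https://api.example.com/v1/stream', 'r');   // placeholder URL
        if ($upstream === false) {
            return;
        }
        while (!feof($upstream)) {
            echo fread($upstream, 8192);
            flush();   // push each chunk onward as soon as it arrives
        }
        fclose($upstream);
    }, 200, ['Content-Type' => 'text/event-stream']);
}
```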
I made some tests to see how to implement that feature, but I have trouble with `fastcgi_finish_request`: the function is not defined when running a Symfony app in the Docker image `bref/php-82-fpm-dev`. Any idea why? `PHP_SAPI` returns `fpm-fcgi`.
@jlabedo Make sure your process runs in FPM; `bref/php-82-fpm-dev` supports many ways to run PHP.
> `PHP_SAPI` returns `fpm-fcgi`
That does sound like FPM though :) it's surprising 🤔
@mnapoli I tested with an HTTP request, not via the CLI. I was also surprised and tried to tweak php-fpm.conf in Bref's container, with no success.
I'm not sure this is related, but I'm very frustrated right now since I'm trying to use Laravel's `response()->streamDownload()` function to force the download of a CSV report file. I don't want to have to write the CSV to disk just to use `response()->download()`. Under the hood Laravel uses Symfony's `StreamedResponse` class. Streamed responses return the custom headers and custom HTTP response codes, but the actual content is always empty. Is this related to this issue? If not, my apologies and I'll open a separate issue.
@selfsimilar it is related yes, right now what you are trying will not work.
If anyone wants to sponsor the development of this feature, get in touch (matthieu at bref.sh)
I see two ways of supporting it.
For us the use case would be the following, i.e. something like this:
```html
layout
  loading div
  actual data
  <script>  <!-- removes the loading div -->
```
The actual data may require an external API call or a heavy request, or may be made of several blocks. Modern browsers with classical HTML pages can be much more responsive, with much less work on the front end, if you use streaming responses, because then in our Symfony controller we could do:
```php
->send($layoutBegin)
// heavy stuff
->send($block1)
// more heavy stuff
->send($block2)
->send($layoutEnd)
```
So in that case the user could start to see the page very quickly (10 ms?) and block1 fairly quickly (50 ms), even if block2 takes 3 seconds (and maybe the user will not scroll down there anyway), all without needing a single line of JavaScript.
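As a hedged illustration of that controller idea (the `renderLayoutBegin()`, `renderBlock1()`, `renderBlock2()` and `renderLayoutEnd()` helpers are hypothetical), the pattern could look roughly like this with Symfony's `StreamedResponse`, assuming the runtime actually forwards each flushed chunk to the client:

```php
<?php

use Symfony\Component\HttpFoundation\StreamedResponse;

// Sketch of the "layout first, heavy blocks later" idea. Each flush() only
// reaches the browser early if the platform streams responses instead of
// buffering them.
$response = new StreamedResponse(function (): void {
    echo renderLayoutBegin();   // hypothetical helper: <html>, <head>, loading div...
    flush();                    // the user sees the shell almost immediately

    echo renderBlock1();        // hypothetical helper: fast block (~50 ms)
    flush();

    echo renderBlock2();        // hypothetical helper: slow block (~3 s)
    echo renderLayoutEnd();     // closes the layout and removes the loading div
});
```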
> I made some tests to see how to implement that feature, but I have trouble with `fastcgi_finish_request`: the function is not defined when running a Symfony app in the Docker image `bref/php-82-fpm-dev`. Any idea why? `PHP_SAPI` returns `fpm-fcgi`.
Finally got it, by chance: the function is disabled in bref.ini. It will be easy to make it work ;)
```ini
; The lambda environment is not compatible with fastcgi_finish_request
; See https://github.com/brefphp/bref/issues/214
disable_functions=fastcgi_finish_request
```
I forgot to mention that I discovered that the current version of Bref CAN stream responses, at least when calling `response()->streamDownload()` from a Livewire (v2) component. Not sure why it works with this library, but the following works on current Bref:
```php
<?php

use Livewire\Component;
use Symfony\Component\HttpFoundation\StreamedResponse;

class streamExample extends Component
{
    public function exampleStreamedCSVDownload(): StreamedResponse
    {
        $filename = 'example.csv';

        // Make some dummy data
        $events = [];
        for ($i = 0; $i < 100; $i++) {
            $events[] = rand(1, 100);
        }

        // Capture $events with "use": the callback is invoked without arguments.
        return response()->streamDownload(function () use ($events) {
            echo "date, event\r\n";
            foreach ($events as $event) {
                echo "'2023-10-13', {$event}\r\n";
            }
        }, $filename);
    }

    public function render()
    {
        return <<<'blade'
            <div>
                <button wire:click="exampleStreamedCSVDownload">{{ __('Download CSV file') }}</button>
            </div>
            blade;
    }
}
```
@selfsimilar does it work in the sense "it does not throw an error", or in the sense that it's actually streamed (i.e. it does not get buffered by Bref/Lambda)?
That's a really good question. It works in the sense that the download completes and the `StreamedResponse` class does not throw an error. However, I do not know whether Bref/Lambda buffers the contents. I haven't tried downloading large enough files to test the limit of the buffer. But as per my earlier comment in July, trying to return a `StreamedResponse` directly via Laravel would always return the headers filled in but an empty content response. So my goals were met with Livewire and I didn't press too hard to determine whether it was an 'authentic' streaming response.
@selfsimilar it does work, but does not use "Lambda response streaming".
Bref needs to be refactored heavily to support this (it needs to use different APIs from the Lambda runtime API). I'm looking into it right now (how much time it would take), this isn't a small task.
@selfsimilar OK, because one year ago when I tried it, it was also "not throwing errors" (which was good enough for me). The way I tested it was to add a sleep(5) between the for-loop iterations and watch in curl whether the chunks appeared all in one go (when using Lambda) vs one at a time.
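For anyone wanting to reproduce that kind of test, here is a minimal plain-PHP sketch of such a probe (the URL in the curl command is a placeholder). With `curl -N https://example.com/probe.php` you should see one line per second if the response is really streamed, or all five lines at once after about five seconds if it is buffered:

```php
<?php

// Minimal streaming probe: emit one chunk per second. A buffered platform
// delivers all five lines at once; real streaming shows them one at a time.
header('Content-Type: text/plain');

for ($i = 1; $i <= 5; $i++) {
    echo "chunk {$i}\n";
    if (ob_get_level() > 0) {
        ob_flush();   // flush PHP's output buffer, if one is active
    }
    flush();          // ask the SAPI (FPM, built-in server, ...) to send the data
    sleep(1);
}
```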
@bnusunny thank you, that blog post provides very useful information that is not contained in the docs AFAIR (e.g. the mention of the NULL characters).
In any case the hard parts are "implementing these new APIs" + "refactoring the internal design of Bref" to support this new approach (not to mention the heavy testing this implies to avoid any regression). I've estimated this to take several weeks of work, which is not a light task.
@bnusunny one question, from your blog post:
> In Lambda Function URLs, multi-value HTTP headers are not supported
Is this the case only with response streaming, or also with "normal" responses? Is there any way to work around that, for example returning header values separated by commas? To my knowledge API Gateway's 2.0 response format allows multiple headers; doesn't that work here?
No, Lambda Function URL does not support that. Duplicated headers will override previous values. But for cookies, you can provide a list of values. The metadata prelude structure has the details.
```rust
/// Metadata prelude for a stream response.
#[derive(Debug, Default, Serialize)]
#[serde(rename_all = "camelCase")]
pub struct MetadataPrelude {
    #[serde(with = "http_serde::status_code")]
    /// The HTTP status code.
    pub status_code: StatusCode,
    #[serde(with = "http_serde::header_map")]
    /// The HTTP headers.
    pub headers: HeaderMap,
    /// The HTTP cookies.
    pub cookies: Vec<String>,
}
```
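For illustration only, here is a rough PHP sketch of how a custom runtime might send that prelude to the runtime API, based on my reading of the blog post and the custom-runtime docs: a JSON prelude, then eight NUL bytes, then the payload, with the streaming response mode header set. Treat the exact headers, content type, and chunking as assumptions to verify against the official documentation (a real implementation would also stream the body chunked rather than buffering it):

```php
<?php

// Sketch of a streamed Function URL response sent to the Lambda runtime API:
// JSON metadata prelude + eight NUL bytes + the actual payload.
$prelude = json_encode([
    'statusCode' => 200,
    'headers'    => ['content-type' => 'text/html'],
    'cookies'    => ['session=abc123'],   // cookies go in a list, not in headers
]);

$body = $prelude . str_repeat("\0", 8) . '<html>...streamed payload...</html>';

$runtimeApi = getenv('AWS_LAMBDA_RUNTIME_API');
$requestId  = 'REQUEST_ID';   // taken from the "next invocation" response

$ch = curl_init("http://{$runtimeApi}/2018-06-01/runtime/invocation/{$requestId}/response");
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $body,   // a real runtime would send this chunked
    CURLOPT_HTTPHEADER     => [
        'Lambda-Runtime-Function-Response-Mode: streaming',
        'Content-Type: application/vnd.awslambda.http-integration-response',
    ],
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
curl_close($ch);
```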
AWS has announced support for response streaming in Lambda. This would reduce TTFB and allow larger response payloads.
Blog post: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/
It's also supported for custom runtimes: https://docs.aws.amazon.com/lambda/latest/dg/runtimes-custom.html#runtimes-custom-response-streaming
Would be awesome if we could adopt this in Bref 🚀