brefphp / bref

Serverless PHP on AWS Lambda
https://bref.sh
MIT License

Feature: Support Swoole #306

Closed: viezel closed this issue 1 year ago

viezel commented 5 years ago

Swoole has various very interesting concepts. It's crazy fast, async everything (PHP, MySQL, Redis, etc.), and it has support for coroutines and websockets.

I opened a ticket on the Swoole repo to support Lambda, but it seems it's the other way around. See this comment: https://github.com/swoole/swoole-src/issues/2422#issuecomment-482910249

new Process(['swoole-cli', 'index.php']);

In index.php:

$server = new swoole_server('swoole.sock', 0, SWOOLE_PROCESS, SWOOLE_UNIX_STREAM);
$server->set(['open_http_protocol' => true]);
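
For context, a filled-out version of that snippet could look roughly like this (a sketch only: the receive callback, the start() call and the socket path are additions for illustration, not part of the original comment):

<?php
// Sketch of a complete unix-socket Swoole server based on the snippet above.
// The request handling is deliberately trivial.

$server = new Swoole\Server('/tmp/swoole.sock', 0, SWOOLE_PROCESS, SWOOLE_UNIX_STREAM);
$server->set(['open_http_protocol' => true]); // parse incoming data as HTTP

$server->on('receive', function (Swoole\Server $server, int $fd, int $reactorId, string $data) {
    // $data contains the raw HTTP request; a real setup would route it properly.
    $server->send($fd, "HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok");
});

$server->start();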

@mnapoli interested in this supporting Swoole?

nealio82 commented 5 years ago

You could definitely create your own runtime to use Swoole. I think you'd need to create your own layer which just includes your own bootstrap file to set up the Swoole environment, and you could probably use it in conjunction with Bref's base php layer.

Have a look at the Bref console layer as an example of the minimum you need in order to put a runtime on top of the base Bref PHP layer.
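
For anyone exploring that route, here is a rough sketch of what a minimal custom bootstrap built directly on the Lambda Runtime API could look like (illustrative only, not Bref's actual console layer; the /opt/bin/php path and the trivial echo handler are assumptions):

#!/opt/bin/php
<?php
// Minimal Runtime API loop: fetch the next invocation, handle it, post the result.

$api = getenv('AWS_LAMBDA_RUNTIME_API');

while (true) {
    // 1. Long-poll for the next invocation
    $ch = curl_init("http://$api/2018-06-01/runtime/invocation/next");
    curl_setopt_array($ch, [CURLOPT_RETURNTRANSFER => true, CURLOPT_HEADER => true]);
    $raw = curl_exec($ch);
    $headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
    curl_close($ch);

    preg_match('/Lambda-Runtime-Aws-Request-Id:\s*(\S+)/i', substr($raw, 0, $headerSize), $matches);
    $requestId = trim($matches[1]);
    $event = substr($raw, $headerSize);

    // 2. Handle the event (here: echo it back; a Swoole setup would forward it instead)
    $result = $event;

    // 3. Post the result for this invocation
    $ch = curl_init("http://$api/2018-06-01/runtime/invocation/$requestId/response");
    curl_setopt_array($ch, [
        CURLOPT_POST => true,
        CURLOPT_POSTFIELDS => $result,
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    curl_close($ch);
}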

mnapoli commented 5 years ago

@viezel I'm not sure it makes sense in Lambda.

Swoole is awesome at supporting high concurrency, especially thanks to async.

In Lambda there is no concurrency: one request per lambda at a time. So the Swoole model wouldn't change much (I guess) because the request will still take the same time to execute.

viezel commented 5 years ago

So the Bref\Runtime\LambdaRuntime will never be reused between requests? I was under the impression that Lambda executions reuse the runtime layer, and therefore that we don't pay the cold start on every request, right?

If that's the case, then Swoole is easily 5× faster than PHP-FPM in terms of handling requests. But it seems I'm understanding the Lambda execution model incorrectly?

mnapoli commented 5 years ago

I am not an expert with Swoole so I might be missing something.

so the Bref\Runtime\LambdaRuntime will never be reused between requests?

Yes, it will be reused. After the cold start the same lambda will process other requests (and keep PHP-FPM alive). What's important to understand is that one lambda processes one request at a time. There is no concurrency.

What I'm missing is that PHP-FPM only adds around 2-3 ms of overhead (https://github.com/brefphp/bref-benchmark), so how can Swoole be 5× faster?

I understand that it can handle more concurrent connections on a similar machine (thanks to async), but I don't get how it is faster.

nealio82 commented 5 years ago

Oh, I just looked a bit more at Swoole. I was under the impression it was a bit more ReactPHP-ish (i.e. could be included with Composer and run) rather than a package you need to install with PECL. In that case the Lambda setup would be a little more complex.

viezel commented 5 years ago

If there is no concurrency in lambda, then Swoole will not gain much improvement.

@nealio82 Swoole is a server component: a replacement for nginx and PHP-FPM.

mnapoli commented 5 years ago

Closing until/unless we can find a compelling reason for supporting Swoole.

windrunner414 commented 5 years ago

There may be some benefits, like concurrent clients (MySQL, Redis, ...) and reduced framework (e.g. Laravel) initialization overhead.

Because Swoole is memory-resident, there is no need to initialize the framework for each request. Do you know laravel-swoole? Its performance is 5× that of Laravel on PHP-FPM (only echoing "hello world").

But I'm not sure if it's useful enough.

nealio82 commented 5 years ago

What would be the difference between loading the framework into memory using Swoole vs loading the framework into memory using React-PHP? It could be fun to try creating runtimes for each of these

windrunner414 commented 5 years ago

Exactly the same. They are all php-cli. I'm not sure if there's a framework like laravel-react, but I think creating runtimes for them would be very similar.

Swoole provides a binary package to simplify deployment: https://www.swoole.com/page/download. You can also precompile swoole.so and add it to the php-cli config.

mnapoli commented 5 years ago

In those scenarios we would get the best performance with the execution model of Bref 0.2 (the code was removed from the repository but can easily be retrieved the day we want to bring that back). It's basically like React/Swoole (everything in the same process); what we need to do is map the API Gateway request/response to PSR-7. That avoids depending on any specific framework (React, Swoole).
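
To illustrate that mapping, here is a sketch of turning an API Gateway (REST / v1 format) event into a PSR-7 request, using nyholm/psr7 purely as an example factory (this is not Bref's implementation):

<?php

use Nyholm\Psr7\Factory\Psr17Factory;
use Psr\Http\Message\ServerRequestInterface;

function eventToPsr7(array $event): ServerRequestInterface
{
    $factory = new Psr17Factory();

    // Rebuild the URI from the event's path and query string parameters
    $query = http_build_query($event['queryStringParameters'] ?? []);
    $uri = ($event['path'] ?? '/') . ($query !== '' ? '?' . $query : '');

    $request = $factory->createServerRequest($event['httpMethod'] ?? 'GET', $uri);

    foreach ($event['headers'] ?? [] as $name => $value) {
        $request = $request->withHeader($name, $value);
    }

    return $request->withBody($factory->createStream($event['body'] ?? ''));
}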

I haven't looked into this for lack of time, and because at the moment my priority is making sure PHP-FPM works well. But in the future I'm certain there is a lot of potential there!

windrunner414 commented 5 years ago

Makes sense. Swoole will support FastCGI in the future, so I don't think Bref would need to modify anything to use Swoole, right?

leocavalcante commented 3 years ago

IMO the project can still benefit from Swoole by allowing functions to run internal I/O concurrently. I mean: imagine that your lambda's job is to call two other APIs before returning a response, and imagine that each of these APIs takes a hypothetical 1 second to finish. With regular PHP the total lambda time will be no less than the sum of the two API calls: 2 seconds. But with Swoole you can move each request into its own coroutine and run the requests inside the lambda concurrently, so instead of taking 2 seconds to finish it will take only 1. This directly impacts lambda costs as well.
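
A sketch of that idea with Swoole coroutines (the URLs and the 1-second timings are hypothetical):

<?php

use Swoole\Coroutine;
use Swoole\Coroutine\WaitGroup;

Coroutine::set(['hook_flags' => SWOOLE_HOOK_ALL]); // make blocking I/O coroutine-aware

Coroutine\run(function () {
    $wg = new WaitGroup();
    $results = [];

    foreach (['https://api-a.example.test/', 'https://api-b.example.test/'] as $i => $url) {
        $wg->add();
        Coroutine::create(function () use ($wg, &$results, $i, $url) {
            // Both calls are in flight at the same time, so the total wall time
            // is roughly max(1s, 1s) instead of 1s + 1s.
            $results[$i] = file_get_contents($url);
            $wg->done();
        });
    }

    $wg->wait();
    // ...build the Lambda response from $results
});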

mnapoli commented 3 years ago

@leocavalcante that is correct.

For the record I am not against supporting Swoole. I just won't implement it myself for now (unless sponsored by a company).

Swoole support could maybe be implemented separately from Bref via a raw HTTP handler (https://bref.sh/docs/function/handlers.html#api-gateway-http-events): the HTTP event could be forwarded to Swoole.
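
A rough sketch of that approach, assuming the handler API from the docs linked above and a Swoole HTTP server already listening on a unix socket (the socket path and class name here are illustrative):

<?php
// Sketch: forward each API Gateway HTTP event to a local Swoole server over a unix socket.

use Bref\Context\Context;
use Bref\Event\Http\HttpHandler;
use Bref\Event\Http\HttpRequestEvent;
use Bref\Event\Http\HttpResponse;

class SwooleProxyHandler extends HttpHandler
{
    public function handleRequest(HttpRequestEvent $event, Context $context): HttpResponse
    {
        $ch = curl_init('http://localhost' . $event->getUri());
        curl_setopt_array($ch, [
            CURLOPT_UNIX_SOCKET_PATH => '/tmp/swoole.sock', // assumed socket path
            CURLOPT_CUSTOMREQUEST    => $event->getMethod(),
            CURLOPT_POSTFIELDS       => $event->getBody(),
            CURLOPT_RETURNTRANSFER   => true,
        ]);
        $body = curl_exec($ch);
        $status = curl_getinfo($ch, CURLINFO_RESPONSE_CODE);
        curl_close($ch);

        return new HttpResponse($body ?: '', [], $status ?: 502);
    }
}

return new SwooleProxyHandler();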

georgeboot commented 3 years ago

Laravel Vapor just added support for Swoole and they claim the speed improvements in real-world apps are indeed worth the effort.

AFAIK the main speed improvements come from:

Does this shine any new light on this issue?

mnapoli commented 3 years ago

@georgeboot yes! We've actually been discussing this with @deleugpn those last few weeks.

Here's what's happening on the Symfony bridge: https://github.com/brefphp/symfony-bridge/pull/45. We could apply exactly the same thing in the Laravel bridge, and that would allow exactly that.

georgeboot commented 3 years ago

Awesome!

Am I correct in stating that re-using a booted kernel will basically be the same as what Swoole/Roadrunner will do, in the context of non-concurrent invocations like Lambda?

mnapoli commented 3 years ago

Yes (related Vapor announcement: https://twitter.com/laravelphp/status/1445019209017270272).

georgeboot commented 3 years ago

Yes, okay. So from what I see in the issue you referred to, the plan isn't really to use Swoole/RoadRunner, but to just pre-boot the kernel and have that kernel handle multiple requests.

Since each lambda anyway processes one request at a time, there is no added benefit to using Swoole/RoadRunner.

Right? Please let me know how I can help get this implemented for Laravel.

mnapoli commented 3 years ago

Since each lambda anyway processes one request at a time, there is no added benefit to using Swoole/RoadRunner.

One could choose to use Swoole to get coroutines or things like that. But the main benefit is about booting the app once, I believe.

And regarding Laravel, the goal is to achieve something very similar to what is done in the Symfony pull request.
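
For illustration, the "boot once, reuse the kernel" idea looks roughly like this for a Symfony app (this is not the actual symfony-bridge code; App\Kernel is the usual application kernel and handleInvocation is a made-up name):

<?php

use App\Kernel;
use Symfony\Component\HttpFoundation\Request;

// Cold start: the kernel (and its container) is built once per Lambda instance.
$kernel = new Kernel('prod', false);
$kernel->boot();

// Called once per invocation; $kernel stays warm in memory between requests.
function handleInvocation(Request $request, Kernel $kernel)
{
    $response = $kernel->handle($request);
    $kernel->terminate($request, $response);

    return $response;
}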

viezel commented 3 years ago

@mnapoli glad you picked it up again. It's very interesting what the Vapor team did with this. I fully agree with the direction of the Symfony PR. That makes sense for Bref.

FelipoAntonoff commented 2 years ago

The direction of this issue is very good :) Besides Swoole there is also Workerman, and the Webman web framework that uses it. Its performance is in many cases equal or superior to Swoole, without depending on new extensions, and it is easily installed via Composer. See the project on GitHub: https://github.com/walkor/Workerman

The advantage of Swoole, Workerman and the like, beyond what colleagues already mentioned about making N asynchronous calls in the same function, is also in keeping a similar project structure between serverless and a regular cloud deployment: the same code base written for Swoole or Workerman can be reused, taking advantage of their excellent performance both in the cloud and on Lambda while keeping the same base structure.

It would also be good for Bref to focus on some PHP micro-framework or a lighter framework, but that is another question. I have used and liked Flight PHP, but there is also Ubiquity, which even supports Swoole and Workerman, and the famous Slim PHP, in addition to the aforementioned Webman, which uses Workerman and is made by the same author.

viezel commented 1 year ago

No need anymore with 2.0 released 🎉

leocavalcante commented 1 year ago

Hi @viezel, I didn't get why it is not needed anymore since 2.0. Can you show me why, please?

georgeboot commented 1 year ago

Laravel users can now use Octane: https://bref.sh/docs/frameworks/laravel.html#laravel-octane

Non-Laravel users, however, can't use that, and it's technically also not Swoole/RoadRunner.