laravel / octane

Supercharge your Laravel application's performance.
https://laravel.com/docs/octane
MIT License
3.78k stars · 296 forks

Swoole: Doubled memory use of request input data #959

Open alecpl opened 1 month ago

alecpl commented 1 month ago

Octane Version

2.5.6

Laravel Version

10.48.22

PHP Version

8.1.29

What server type are you using?

Swoole

Server Version

5.1.4

Database Driver & Version

No response

Description

I noticed that octane+swoole consumes much more memory than standard artisan serve. For example I'm testing file uploads (using raw POST request) with files of different sizes. Results:

| File size | Laravel | Octane |
|-----------|---------|--------|
| 10 B      |   8 MB  |  23 MB |
| 20 MB     |  22 MB  |  64 MB |
| 50 MB     |  53 MB  | 125 MB |

So it looks like, with Swoole, the file is copied twice in memory.

Steps To Reproduce

Create a POST route that does nothing but log/return `memory_get_peak_usage()`, then run `curl -k --data-binary "@file" -H "Expect:" -H "Content-Type: text/plain" https://host/path/to/route`. Make the file `./file` an appropriate size.
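For reference, a minimal sketch of such a reproduction route (the path matches the curl command above; the route body is my own illustration, not code from this report):

```php
use Illuminate\Support\Facades\Route;

// Minimal reproduction route: accept the POSTed body and report how much
// memory the request consumed, both at its peak and still allocated now.
Route::post('/path/to/route', function () {
    return [
        'peak'    => memory_get_peak_usage(),
        'current' => memory_get_usage(),
    ];
});
```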

Swoole configuration from config/octane.php:

    'swoole' => [
        'options' => [
            'log_file' => storage_path('logs/swoole_http.log'),
            'package_max_length' => env('SWOOLE_PACKAGE_MAX_LENGTH', 50 * 1024 * 1024),
            'enable_coroutine' => false,
            'send_yield' => true,
            'socket_buffer_size' => 10 * 1024 * 1024,
        ],
    ],
NathanFreeman commented 1 month ago

Hello, do you mean that there is a memory leak after uploading the file?

alecpl commented 1 month ago

I don't think it's a leak, but it is excessive memory usage: with Octane a request needs twice as much memory as it does with plain Laravel.

I can only guess that the Swoole server keeps the input in memory and the Laravel Request keeps another copy of it. Ideally it would not create a copy.

EDIT: BTW, this is not just peak usage; `memory_get_usage()` returns the same values, which means the memory is still allocated.

github-actions[bot] commented 1 month ago

Thank you for reporting this issue!

As Laravel is an open source project, we rely on the community to help us diagnose and fix issues as it is not possible to research and fix every issue reported to us via GitHub.

If possible, please make a pull request fixing the issue you have described, along with corresponding tests. All pull requests are promptly reviewed by the Laravel team.

Thank you!

NathanFreeman commented 1 month ago

Swoole keeps a copy of the request data in memory; you can get the raw HTTP message through `Swoole\Http\Request->getData()`.
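To illustrate the two copies, a sketch of a bare Swoole request handler (outside Octane; the handler body is my own illustration under that assumption):

```php
// Sketch: in a plain Swoole HTTP server, the full raw HTTP message is
// retained by Swoole and can be read back via getData(). Reading the body
// through rawContent() (or Laravel's Request) materializes a second copy.
$server->on('request', function (Swoole\Http\Request $req, Swoole\Http\Response $res) {
    $raw  = $req->getData();     // complete raw HTTP message held by Swoole
    $body = $req->rawContent();  // body only; this is a separate allocation

    $res->end('raw: ' . strlen($raw) . ' bytes, body: ' . strlen($body) . " bytes\n");
});
```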

NathanFreeman commented 1 month ago

You can try setting the upload_max_filesize configuration. As long as the Content-Length header of the HTTP message is greater than package_max_length, Swoole will write the uploaded file content directly to a temporary file instead of keeping it in memory.

    'swoole' => [
        'options' => [
            'package_max_length' => 1 * 1024 * 1024,
            'upload_max_filesize' => 5 * 1024 * 1024,
        ],
    ],

Because upload_max_filesize is set to a value greater than 0, the client can upload files larger than package_max_length without encountering an error. When upload_max_filesize is non-zero, Swoole writes the uploaded file content (that is, the request body) to disk instead of keeping it in memory. The request headers, of course, are still held in memory.

alecpl commented 1 month ago

Thank you. I confirm that your suggestion works for files uploaded with the multipart/form-data method. It does not work when the payload is POSTed as an unstructured request body; in that case package_max_length limits the input size and the body is handled entirely in memory.
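The difference is visible on the application side as well. A hypothetical route sketch (names and path are my own, assuming the config suggested above) showing the two paths a request body can take:

```php
use Illuminate\Http\Request;
use Illuminate\Support\Facades\Route;

// Hypothetical route contrasting the two upload paths discussed above.
Route::post('/upload', function (Request $request) {
    if ($request->hasFile('file')) {
        // multipart/form-data: Swoole has spooled the body to a temporary
        // file on disk, so worker memory stays low.
        $path = $request->file('file')->getRealPath();
        return ['source' => 'temp file', 'size' => filesize($path)];
    }

    // Raw body (e.g. curl --data-binary): still buffered in memory and
    // capped by package_max_length.
    return ['source' => 'memory', 'size' => strlen($request->getContent())];
});
```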