Closed toto4ds closed 1 year ago
What makes you think that's a memory leak? All the function is doing is making an API call and returning an integer:
I think there is an issue with hitting the S3 API a ton of times very quickly. Not sure, but it's best to avoid this and specify the filesize up front yourself when you have a lot of files to zip.
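A minimal sketch of what "specify the filesize up front" could look like; the setFilesize() method name is an assumption about the package's API, so check the actual S3File model before relying on it:

```php
// Sketch, assuming an S3File setter exists for the size.
// Providing the byte count yourself means the package never
// needs to issue a HeadObject request for this file.
$zip->add(
    S3File::make('s3://bucket/key', 'report.pdf')->setFilesize(1048576)
);
```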
Just a guess, my mistake.
In any case, if you use Laravel with league/flysystem-aws-s3-v3 (which in turn uses aws/aws-sdk-php), there is no problem. It works very quickly and without errors.
Yes, this package uses aws/aws-sdk-php as well; it's required for S3 file support. And the flysystem package makes the exact same 'HeadObject' API call.
I think I found what the problem is.
https://github.com/stechstudio/laravel-zipstream/blob/master/src/Models/S3File.php#L45-L52
$this->client is always empty, so the client is created every time, and that just takes a long time.
This way it works very fast and without errors:

```php
$client = app('zipstream.s3client');

$zip->add(
    S3File::make('s3://bucket/key', 'zipfilename.txt')->setS3Client($client)
);
```
I think the client should be created in S3File.php only once, or not?
That makes sense. We definitely don't want to create a new S3Client for every file.
The S3File model gets the client from the container, which is wired up here:
I would think this should simply change to $this->app->singleton so it's only created once. If you're currently testing this, go ahead and make that change and see if it fixes the issue?
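A sketch of that change in the package's service provider; the binding name comes from this thread, but the closure body (region/version options) is an assumption:

```php
// In the service provider's register() method.
// Binding as a singleton means the container builds the
// S3Client once and hands back the same instance on every
// subsequent app('zipstream.s3client') call.
$this->app->singleton('zipstream.s3client', function ($app) {
    return new \Aws\S3\S3Client([
        'region'  => config('zipstream.aws.region'), // assumed config key
        'version' => 'latest',
    ]);
});
```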
Yes, there is no error with $this->app->singleton, but $this->client is still empty, and $this->client = app('zipstream.s3client'); is called every time.
Sure, it will retrieve the client from the container every time, but so long as the client is only built once, we're good. I want to stick with the container approach, since some apps re-bind that client in the container, creating their own custom client.
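For illustration, the kind of re-binding the container approach preserves; the endpoint and options here are hypothetical, not taken from any real app:

```php
// In an application service provider: override the container key
// so the package picks up a custom-configured client instead of
// the package's default one.
$this->app->singleton('zipstream.s3client', function () {
    return new \Aws\S3\S3Client([
        'region'                  => 'eu-central-1',
        'version'                 => 'latest',
        'endpoint'                => 'https://s3.example.internal', // hypothetical
        'use_path_style_endpoint' => true,
    ]);
});
```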
I just tagged 4.10 with this change, you should be all set. Thanks for your help with this!
Looks like a mistake with the package description. composer outdated -m shows "Build":
stechstudio/laravel-zipstream 4.9 4.10 Build
And what about #87? I don't think it's necessary to request the headers if the file size is set from outside; it kills the optimization.
mistake with the description
Huh? I don't know what you mean.
Feel free to add comments in #87
When I run the command composer outdated -m, I get:
spatie/laravel-ignition 1.6.3 1.6.4 A beautiful error page for Laravel applications.
stechstudio/laravel-zipstream 4.9 4.10 Build
symfony/css-selector v6.0.11 v6.0.17 Converts CSS selectors to XPath expressions
Why "Build"? Maybe it should be something like "A fast and simple streaming zip file downloader for Laravel".
Looks like this line needs to be fixed: https://github.com/stechstudio/laravel-zipstream/blob/master/composer.json#L3
Hi All, If I try to add more than 400 files (for me) via S3 I get an error:
Error executing "HeadObject" on "https://***/***" AWS HTTP error: cURL error 6: Could not resolve host: *** (see https://curl.haxx.se/libcurl/c/libcurl-errors.html) for https://***/***
and very large memory consumption overall. If I replace the calculateFilesize function (so it does not call S3 but returns the real size), the zip is formed normally. HttpFile doesn't have that kind of problem. Apparently the garbage collector is not running, or an unset() is needed somewhere.