KnpLabs / Gaufrette

PHP library that provides a filesystem abstraction layer - will be a feast for your files!
http://knplabs.github.io/Gaufrette
MIT License

Copying large files leads to memory exhaustion #377

Open SrgSteak opened 8 years ago

SrgSteak commented 8 years ago

Copying large files from one stream to another can lead to a fatal error.

Copying from one local adapter to another local adapter works (tested with a ~5GB file), but copying from a local adapter to aws_s3 doesn't (PHP Fatal error: Allowed memory size of ... exhausted in vendor/knplabs/gaufrette/src/Gaufrette/Stream/InMemoryBuffer.php on line 90).
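
For context, a minimal sketch of a chunked copy through PHP's native stream layer, assuming the gaufrette:// stream wrapper is registered and that "src"/"dst" are hypothetical filesystem names in the map:

    <?php
    // Copy in fixed-size chunks via PHP's stream layer instead of
    // materialising the whole file in memory first.
    // "src" and "dst" are hypothetical entries in Gaufrette's FilesystemMap.
    $in  = fopen('gaufrette://src/huge-file.bin', 'rb');
    $out = fopen('gaufrette://dst/huge-file.bin', 'wb');

    stream_copy_to_stream($in, $out); // streams chunk by chunk

    fclose($in);
    fclose($out);

Whether this keeps memory flat still depends on the adapter: adapters without native stream support fall back to InMemoryBuffer, which loads the whole file anyway and is exactly the failure reported above.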

polantis commented 7 years ago

Hello,

I am facing the same performance issue. I cannot download big files from S3 using the aws_s3 adapter; I am running into memory errors.

Furthermore, I am trying to add several big files to a ZipArchive like this: $zipArchive->addFromString($relativePath, file_get_contents($filename));

This doesn't seem like the best solution. Do you have any ideas?

Thanks
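
As a commonly used workaround (a sketch, not a Gaufrette feature): stream each object to a local temp file and hand the path to ZipArchive::addFile(), which records only the path and reads the data when the archive is closed, instead of holding every file's contents in memory via addFromString(). The $keys list and the "myfs" wrapper name below are assumptions:

    <?php
    $zip = new ZipArchive();
    $zip->open('/tmp/archive.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE);

    foreach ($keys as $relativePath) { // $keys: hypothetical list of object keys
        // Download in chunks to a temp file instead of file_get_contents().
        $tmp = tempnam(sys_get_temp_dir(), 'zip');
        $src = fopen('gaufrette://myfs/' . $relativePath, 'rb');
        $dst = fopen($tmp, 'wb');
        stream_copy_to_stream($src, $dst);
        fclose($src);
        fclose($dst);

        // addFile() stores only the path; the data is read at close() time.
        $zip->addFile($tmp, $relativePath);
    }

    $zip->close(); // the temp files must still exist at this point

The temp files can only be unlinked after close(), since that is when ZipArchive actually reads them.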

akerouanton commented 7 years ago

We plan to fix such issues in the upcoming months, but we don't currently have a solution, sorry.

y4roc commented 7 years ago

@SrgSteak Did you use copy('gaufrette://filesystem/file.tar.gz', '/tmp/file.tar.gz')? I had the same issue, but I wasn't using copy(); with copy(), my application works fine.
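
For anyone landing here, a sketch of the setup y4roc describes, assuming $filesystem is an already configured Gaufrette\Filesystem (the map name "filesystem" matches the path in the comment above):

    <?php
    use Gaufrette\StreamWrapper;

    // Expose the filesystem under gaufrette://filesystem/... so that PHP's
    // native stream functions (copy, fopen, ...) transfer data in chunks.
    StreamWrapper::getFilesystemMap()->set('filesystem', $filesystem);
    StreamWrapper::register();

    copy('gaufrette://filesystem/file.tar.gz', '/tmp/file.tar.gz');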

SrgSteak commented 7 years ago

@thhan that ticket is nearly 2 years old. We moved to Flysystem a long time ago.