flownative / flow-aws-s3

Amazon S3 adaptor for Neos and Flow

Large number of resources results in: failed to open stream: Too many open files #9

Closed: langeland closed this issue 3 years ago

langeland commented 8 years ago

When running resource:copy, I get:

./flow resource:copy --publish persistent tmpNewCollection
Copying resource objects from collection "persistent" to collection "tmpNewCollection" ...

  4087/15130 [=======>--------------------]  27%
Warning: include_once(/home/sites/s3test/flow/Packages/Framework/TYPO3.Flow/Classes/TYPO3/Flow/Resource/Storage/Exception.php): failed to open stream: Too many open files in /home/sites/s3test/flow/Packages/Framework/TYPO3.Flow/Classes/TYPO3/Flow/Core/ClassLoader.php line 224

  Type: TYPO3\Flow\Error\Exception
  Code: 1
  File: Packages/Framework/TYPO3.Flow/Classes/TYPO3/Flow/Error/ErrorHandler.php
  Line: 80

One possibility is simply to raise the limit on the number of open files, but I'm not sure that is the right thing to do.

I'm running typo3/flow dev-master e60b50b.

kdambekalns commented 7 years ago

@langeland Do you still see that? Or did you abandon the use of this package? ;) If you have up-to-date information, that'd be great. Otherwise we'll try to reproduce…

limchivy commented 6 years ago

@kdambekalns @langeland Copying takes a long time, so I use aws s3 cp instead.

After that, resource:clean also takes a long time to run:

FLOW_CONTEXT=Production ./flow resource:clean
Checking if resource data exists for all known resource objects ...

 16238/131369 [===>-------------------------] 12%
albe commented 3 years ago

Could be solved by making the getObjects[ByCollection]() methods return generators instead of arrays :) See the core FileSystemStorage for reference: https://github.com/neos/flow-development-collection/blob/7.0/Neos.Flow/Classes/ResourceManagement/Storage/FileSystemStorage.php#L136-L173
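For illustration only, here is a minimal sketch of what such a generator-based listing could look like. It assumes the AWS SDK for PHP (an S3Client with a ListObjectsV2 paginator and its s3:// stream wrapper) and Flow's StorageObject class; the function name listStorageObjects and the bucket/prefix parameters are placeholders, not the package's actual API:

```php
<?php
declare(strict_types=1);

use Aws\S3\S3Client;
use Neos\Flow\ResourceManagement\Storage\StorageObject;

/**
 * Lazily lists all objects under a key prefix as StorageObject instances.
 * Only one page of keys is held in memory at a time, and streams are opened
 * on demand, so the process never accumulates thousands of open file handles.
 */
function listStorageObjects(S3Client $s3Client, string $bucketName, string $keyPrefix): \Generator
{
    // The paginator transparently follows ListObjectsV2 continuation tokens.
    $pages = $s3Client->getPaginator('ListObjectsV2', [
        'Bucket' => $bucketName,
        'Prefix' => $keyPrefix
    ]);

    foreach ($pages as $page) {
        foreach ($page['Contents'] ?? [] as $entry) {
            $key = $entry['Key'];

            $object = new StorageObject();
            $object->setFilename(basename($key));
            $object->setFileSize((int)$entry['Size']);
            // Open the stream lazily: fopen() only runs when a consumer actually
            // reads the object. This assumes the s3:// stream wrapper has been
            // registered via $s3Client->registerStreamWrapper().
            $object->setStream(function () use ($bucketName, $key) {
                return fopen('s3://' . $bucketName . '/' . $key, 'rb');
            });

            yield $object;
        }
    }
}
```

Because the stream is wrapped in a closure, a file handle is only opened when the consumer reads that particular object, so commands like resource:copy and resource:clean iterate over one open handle at a time instead of hitting the "Too many open files" limit.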