bato3 opened this issue 4 years ago
I've opened an issue on the database repo: https://github.com/joomla-framework/database/issues/287
So the solution to this issue would be to stagger reading and writing the table data during export: limit the number of rows read per query, write that batch to a file resource, then read and write the next batch, repeating until done. This is a bigger refactoring, and I'm unsure whether our Filesystem package properly supports this.
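The staggered read/write loop described above could look roughly like the following. This is a minimal sketch in Python with SQLite, not the Joomla implementation: the table name, CSV output format, batch size, and the `export_table_in_batches` helper are all illustrative assumptions; the point is only that memory usage stays bounded by the batch size rather than the table size.

```python
# Hypothetical sketch of a staggered export: fetch a limited batch of
# rows, append it to the output file, then fetch the next batch until
# the table is exhausted. Only `batch_size` rows are in memory at once.
import os
import sqlite3
import tempfile


def export_table_in_batches(conn, table, out_path, batch_size=1000):
    """Write every row of `table` to `out_path` as CSV without ever
    loading the whole table into memory. Returns the row count."""
    offset = 0
    with open(out_path, "w", encoding="utf-8") as fh:
        while True:
            rows = conn.execute(
                f"SELECT * FROM {table} LIMIT ? OFFSET ?",
                (batch_size, offset),
            ).fetchall()
            if not rows:
                break  # table exhausted
            for row in rows:
                fh.write(",".join(map(str, row)) + "\n")
            offset += len(rows)
    return offset


# Demo with a small in-memory table standing in for #_finder_terms.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE finder_terms (id INTEGER, term TEXT)")
conn.executemany(
    "INSERT INTO finder_terms VALUES (?, ?)",
    [(i, f"term{i}") for i in range(2500)],
)
path = os.path.join(tempfile.gettempdir(), "finder_terms_export.csv")
total = export_table_in_batches(conn, "finder_terms", path, batch_size=1000)
```

One caveat with `LIMIT`/`OFFSET` paging: the offset scan itself gets slower on very large tables, so keyset pagination (`WHERE id > last_seen_id`) may be preferable if the table has a usable key.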
Steps to reproduce the issue

Default PHP 7.4 configuration from XAMPP, 389597 rows in #_finder_terms (127.9 MB on disk).

Actual result
System information (as much as possible)
Additional comments
I know that I can reconfigure the server, but I think we should find a solution that requires less memory.