EmicoEcommerce / Magento2TweakwiseExport-archived

Magento 2 module for Tweakwise export

Max memory limit reached while exporting #45

Closed: hostep closed this issue 5 years ago

hostep commented 6 years ago

Issue Brief

Hi there

We are trying to use this module on the Magento Cloud environment. The staging & production environments have 16 GB of memory available to us, but the default PHP memory_limit value is set to 1 GB, even for php-cli calls.

When we try to run the export on a shop with more than 40,000 products, we run into memory problems. See the steps to reproduce below.

Would it be possible to optimize the export code so it uses less memory? Maybe export in batches of a fixed number of products at a time (and perhaps make this batch size configurable)?
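
To illustrate the idea (a minimal, hypothetical sketch in plain PHP, not the module's actual code; the loader and writer steps are stand-ins), processing the catalog in fixed-size chunks and dropping all references after each chunk keeps peak memory roughly proportional to the batch size:

    <?php
    // Hypothetical sketch of batched exporting, not the module's actual code.
    // Only one chunk of products is held in memory at any time.

    $batchSize  = 500;                 // could be made configurable
    $productIds = range(1, 40661);     // stand-in for the real product ID list

    foreach (array_chunk($productIds, $batchSize) as $i => $batch) {
        // Stand-in for loading and transforming the products of this batch.
        $entities = array_map(function ($id) {
            return ['entity_id' => $id, 'name' => 'Product ' . $id];
        }, $batch);

        // ... write $entities to the export feed here ...

        // Drop all references so PHP can reuse the memory for the next batch.
        unset($entities);
    }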

Environment

Steps to reproduce

  1. Have a Magento installation on the Magento Cloud environment where the default memory_limit is set to 1 GB for the php-cli
  2. Have 40661 products in the catalog, 2 store views, and 110 product attributes
  3. Run the export cronjob; I'm using n98/magerun2 for this so we can see the errors:
    vendor/bin/n98-magerun2 sys:cron:run emico_tweakwise_export

Actual result

    Run Emico\TweakwiseExport\Cron\Export::execute PHP Fatal error:  Allowed memory size of 1073741824 bytes exhausted (tried to allocate 20480 bytes) in vendor/emico/tweakwise-export/src/Model/Write/Products/ExportEntity.php on line 267

    Check https://getcomposer.org/doc/articles/troubleshooting.md#memory-limit-errors for more info on how to handle out of memory errors.

Expected result

The export runs fine when a decent memory_limit value is used. Maybe try to export in batches of a few hundred products at a time to avoid running into memory problems?

Thanks!

Fgruntjes commented 6 years ago

We lowered the batch size to 500 products, could you give it another try?

hostep commented 6 years ago

Thanks @Fgruntjes, I wasn't aware you were already using batches, sorry about that :)

I'll try to test this out next week somewhere.

Thanks for the feedback!

Fgruntjes commented 6 years ago

@hostep NP, in the first release we did extensive testing to make sure memory usage did not increase across batches. However, we are now a few releases later, and it's possible that a memory leak slipped in. So I'm not 100% sure that this will work, or that there is a real problem here.

Another solution would be to make the batch size configurable. Just let me know how it went; if it didn't solve the problem, we'll dig in a bit deeper.
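
One quick way to see whether a leak crept in (a generic PHP sketch, not tied to the module's internals) is to log memory_get_usage() after every batch; if the numbers keep climbing instead of levelling off, references from earlier batches are being kept alive somewhere:

    <?php
    // Generic per-batch memory check, independent of the module's code.

    $previous = memory_get_usage(true);

    foreach (array_chunk(range(1, 40661), 500) as $i => $batch) {
        // ... run the real per-batch export work here ...

        gc_collect_cycles();                 // collect any cyclic references
        $current = memory_get_usage(true);
        printf("batch %3d: %10d bytes (%+d)\n", $i, $current, $current - $previous);
        $previous = $current;
    }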

hostep commented 6 years ago

@Fgruntjes: I finally found some time to test this.

Unfortunately the change didn't help in our case; we still run into the same problem with a memory_limit of 1G.

I even tried lowering the BATCH_SIZE to 10 and it still triggers the problem. So it looks like there might be a memory leak somewhere in the export.
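
For what it's worth, leaks like this in batch exports often come from data that is cached on a long-lived object and never cleared between batches. The sketch below is purely illustrative (it is not the module's code), but it shows why shrinking BATCH_SIZE only delays the crash instead of preventing it:

    <?php
    // Illustrative only: a common shape of memory leaks in batched exports.

    class LeakyExporter
    {
        /** @var array Grows by one entry per product and is never cleared. */
        private $loaded = [];

        public function exportBatch(array $batch)
        {
            foreach ($batch as $id) {
                $this->loaded[$id] = ['entity_id' => $id]; // kept for the whole run
            }
            // ... write the batch to the feed ...
        }

        /** Resetting the cache after each batch would keep memory usage flat. */
        public function afterBatch()
        {
            $this->loaded = [];
        }
    }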

edwinljacobs commented 5 years ago

Hello,

We released 1.3.0 yesterday with a fix for this issue.

hostep commented 5 years ago

Thanks!

Unfortunately I won't be able to test this, since I no longer work on the project where we had this issue.

edwinljacobs commented 5 years ago

For now I will close this issue. If this is incorrect, feel free to create a new issue.

With kind regards