Bug Description
Our current Provider API Script for ingesting Wikimedia Commons data is triggering an out-of-memory error on the API side. The error message is:
PHP fatal error: Allowed memory size of 698351616 bytes exhausted (tried to allocate 1343488 bytes)
To Reproduce
Try to run the Wikimedia Commons Provider API Script with the parameter --date 2020-07-02
Additional context
Discussion with the folks on IRC at #wikimedia-tech has indicated that we should try turning down our LIMIT parameter (which defines how many records we request per call) and turning up the parallelism (slowly; we don't want to be the people who took down Wikipedia for a day). A rough sketch of that pattern follows.
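This is not our provider script, just a minimal sketch of the two knobs under discussion, assuming the public MediaWiki `allimages` endpoint: LIMIT is the per-request batch size we would turn down, and MAX_WORKERS is the parallelism we would turn up slowly. The endpoint, query parameters, window splitting, and the fetch_page/ingest_window helpers are illustrative assumptions, not the script's actual code.

```python
# Sketch only: smaller per-request batches plus modest, bounded parallelism.
from concurrent.futures import ThreadPoolExecutor

import requests

API_URL = "https://commons.wikimedia.org/w/api.php"
HEADERS = {"User-Agent": "provider-script-sketch (ingestion debugging)"}
LIMIT = 100        # turned down from a larger per-request batch size
MAX_WORKERS = 2    # turn parallelism up slowly


def fetch_page(start_ts, end_ts, continue_token=None):
    """Request one small batch of image records for a time window."""
    params = {
        "action": "query",
        "format": "json",
        "list": "allimages",
        "aisort": "timestamp",
        "aistart": start_ts,
        "aiend": end_ts,
        "ailimit": LIMIT,
    }
    if continue_token:
        params["aicontinue"] = continue_token
    response = requests.get(API_URL, params=params, headers=HEADERS, timeout=60)
    response.raise_for_status()
    return response.json()


def ingest_window(window):
    """Walk one time window via continuation tokens, LIMIT records at a time."""
    start_ts, end_ts = window
    continue_token = None
    records = []
    while True:
        data = fetch_page(start_ts, end_ts, continue_token)
        records.extend(data.get("query", {}).get("allimages", []))
        continue_token = data.get("continue", {}).get("aicontinue")
        if not continue_token:
            return records


# Split the problem date into sub-windows and process them with a small pool.
windows = [
    ("2020-07-02T00:00:00Z", "2020-07-02T12:00:00Z"),
    ("2020-07-02T12:00:00Z", "2020-07-03T00:00:00Z"),
]
with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
    for batch in pool.map(ingest_window, windows):
        # Hand each batch to the rest of the ingestion pipeline here.
        print(len(batch))
```

The idea is that each individual response stays small enough for the API to serve without exhausting PHP memory, and total throughput is recovered by running a couple of windows concurrently rather than by requesting huge batches.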