Closed shubhamchugh closed 1 year ago
@shubhamchugh in order to import that much data, you'll very likely need to employ a couple of strategies.
Firstly, batching/chunking. This is in place in the package, but it depends on which version you're using.
Secondly, even with batching available, you'll likely want to move the actual importing to an asynchronous process, i.e. a job queue, so that it runs in a completely separate process from the one responding to HTTP requests.
This hasn't been built into this package just yet (there's an open issue for it: #6), but it's being worked on.
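To illustrate the first strategy above: instead of loading every row into memory before writing anything, stream the file and hand off fixed-size batches, each of which could then become one queued job or one bulk insert. This is a minimal, language-agnostic sketch in Python, not the package's actual API; the `chunked_rows` helper and the chunk size are my own assumptions:

```python
import csv
import io

def chunked_rows(fh, chunk_size=1000):
    """Yield lists of up to chunk_size rows from an open CSV file handle.

    Streaming row-by-row keeps memory usage flat regardless of file size,
    rather than holding all 80K (or 1M) rows in memory at once.
    """
    reader = csv.reader(fh)
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk  # flush the final partial batch

# Simulate a large CSV in memory (a hypothetical 5,000-line file).
big_csv = io.StringIO("\n".join(f"id{i},value{i}" for i in range(5000)))

# In a real app, each batch would be dispatched to a queue worker
# instead of being processed inside the HTTP request.
batches = list(chunked_rows(big_csv, chunk_size=1000))
print(len(batches))     # 5 batches
print(len(batches[0]))  # 1000 rows per batch
```

The key point is that peak memory is bounded by the chunk size, not the file size, and pushing each batch onto a queue keeps the web request from hitting execution-time limits.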
Hi @simonhamp
I've got a similar issue with a 6MB CSV of 80K lines. It's impossible to import it.
I get an execution-time-exceeded or memory-exhausted error. I've tried tweaking the script with more memory and no time limit, but nothing... the data are not imported.
Can you give me a hand with this? We can sponsor this fix... Regards
https://downloads.majestic.com/majestic_million.csv — I'm unable to upload 1 million records using this package. Can anyone help me or guide me on how to import huge datasets with this package?