Closed Gustutu closed 8 months ago
I am currently trying to index a large dataset (more than 50k records), but my container is killed on AWS because it reaches its memory limit.

I suspect a memory leak at line 506 of the `models.py` file, where

```python
batch = []
```

could be replaced by

```python
batch.clear()
```

to avoid it. Is it possible that this is the cause of the issue? What do you think?

https://github.com/algolia/algoliasearch-django/blob/7f23fc21ce45146fd9b1a4d48a62bcc175695774/algoliasearch_django/models.py#L506C1-L507C1
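For context, here is a minimal standalone sketch (not the actual `algoliasearch-django` code) of the difference between the two statements. Rebinding `batch = []` makes the old list unreachable as soon as nothing else references it, so Python's garbage collector can free it; `batch.clear()` instead empties the same list object in place, which also empties every other reference to it. The `Item` class and variable names below are illustrative only:

```python
import gc
import weakref


class Item:
    """Stand-in for an object stored in a batch."""
    pass


batch = [Item()]
# Weak reference lets us observe when the item is actually freed.
ref = weakref.ref(batch[0])

batch = []     # rebind: the old list (and its Item) become unreachable
gc.collect()   # force collection to make the effect observable

# The old batch's contents were freed; rebinding alone does not leak.
assert ref() is None

# By contrast, clear() mutates the shared list object:
batch = [Item()]
alias = batch          # a second reference, e.g. held by queued work
batch.clear()          # empties the list in place...
assert alias == []     # ...so the alias is emptied too
```

So by itself, `batch = []` should not leak memory; the old batches stay alive only if some other reference (for example, pending API request payloads) still holds them, in which case `batch.clear()` would not help either.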