Closed lukaszgryglicki closed 4 years ago
Hi @lukaszgryglicki , sorry for the late reply!
The problem may be due to the presence of a large document that exceeds the maximum content size of an HTTP request. You can try modifying your docker-compose to change the value of http.max_content_length. You can find an example of it at:
https://github.com/crossminer/scava-deployment/blob/master/docker-compose.yml#L140
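For reference, a minimal sketch of what that setting looks like in a docker-compose Elasticsearch service (the service name and image tag below are illustrative placeholders, not copied from the linked file; check the link for the actual values):

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch-oss:6.3.2  # illustrative tag
    environment:
      # Raise the maximum HTTP request body size so that large bulk
      # requests are not rejected by Elasticsearch.
      - http.max_content_length=500mb
```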
Hope this helps!
EDIT: When working on that project, I remember that we spent some time finding a good combination of http.max_content_length and bulk size (to avoid the error without slowing down the import of documents into ES too much). In that case, it was: http.max_content_length=500mb and bulk size = 500.
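To illustrate why those two settings interact: the bulk size caps how many documents go into a single HTTP request, so each request's payload stays under http.max_content_length. The sketch below is a generic illustration of that chunking idea, not p2o's actual implementation (the function name chunk_bulks is made up for this example):

```python
def chunk_bulks(docs, bulk_size):
    """Yield successive lists of at most bulk_size documents.

    Each yielded chunk corresponds to one bulk request, so a smaller
    bulk_size means smaller (but more numerous) HTTP requests.
    """
    for i in range(0, len(docs), bulk_size):
        yield docs[i:i + bulk_size]

# 1200 documents with bulk size 500 -> requests of 500, 500, and 200 docs.
bulks = list(chunk_bulks(list(range(1200)), 500))
print([len(b) for b in bulks])  # → [500, 500, 200]
```

Lowering the bulk size reduces the chance of hitting the request-size limit, but increases the number of round trips, which is why a balance between the two settings matters.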
Great, thanks, we're using managed ES, but maybe it is possible to set that value there.
Hi @lukaszgryglicki , I'm closing this issue, feel free to reopen it if needed.
Tried bulk size 1000, then 100, finally 10 - nothing helps.

p2o.py --enrich --index jenkins-raw --index-enrich jenkins -e [...] --bulk-size 100 --scroll-size 100 --db-host [...] --db-sortinghat [...] --db-user [...] --db-password [...] jenkins https://build.opnfv.org/ci/ --no-archive