@javeed90shaik Did you ever debug this further? I am going to close this, but if this is still a problem we'd need to take a look at logs and errors. The plugin will queue data, so most likely it's a server-side problem, with the cluster not able to ingest it.
I was trying to ingest a CSV which is constructed programmatically. The plugin works absolutely fine with small data; the problem happens when ingesting large data, i.e., more than 1M records. Essentially, the data is lost after the run completes, or the process gets killed by itself.
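For reference, here is a minimal sketch of the kind of pipeline involved, assuming a file input reading the generated CSV and the logstash-output-opensearch output; the path, column names, endpoint, index, and credentials below are placeholders, not my actual values:

```
input {
  file {
    path => "/tmp/generated.csv"       # placeholder: the programmatically built CSV
    start_position => "beginning"
    sincedb_path => "/dev/null"        # placeholder: re-read the file on every test run
  }
}

filter {
  csv {
    separator => ","
    columns => ["id", "field1", "field2"]   # placeholder column names
  }
}

output {
  opensearch {
    hosts => ["https://localhost:9200"]     # placeholder cluster endpoint
    index => "csv-test"                     # placeholder index name
    user => "admin"                         # placeholder credentials
    password => "admin"
    ssl_certificate_verification => false
  }
}
```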
Expected behavior: it should load all the records. Instead it loads some of the data and loses most of it. For example, I tried to load 2M records and it indexed only count=404000.
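The indexed total can be checked with OpenSearch's `_count` API (e.g. `GET /csv-test/_count`, using the placeholder index name from the sketch above); it returns a JSON body with a `count` field, which is presumably where the count=404000 figure comes from.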
Properties Enabled:
pipeline.workers: 4
pipeline.batch.size: 4000
pipeline.batch.delay: 50
pipeline.unsafe_shutdown: true
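Worth noting: with pipeline.unsafe_shutdown: true, Logstash is allowed to exit even while events are still in flight in memory, so up to pipeline.workers × pipeline.batch.size = 4 × 4000 = 16,000 buffered events can be dropped on each shutdown. That alone would not explain ~1.6M missing records, but it does compound the loss whenever the process gets killed mid-run.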
Note: the same data loads the full 2M records when I tried it with OpenSearch Logstash 7.16.3.