jiachinzhao opened 2 years ago
I need a way to reduce the size of the response returned when sending bulk data; sometimes the response is larger than the data being sent. The feature should be similar to the `bulk_path` option described here: https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-bulk_path
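For reference, this is roughly what the linked Logstash option looks like when used to trim bulk responses. A hedged sketch only: the host and the exact `filter_path` expression are illustrative, not taken from the original report.

```
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    # Appending filter_path to bulk_path asks Elasticsearch to return
    # only the error-related parts of the bulk response.
    bulk_path => "/_bulk?filter_path=errors,items.*.error,items.*.status"
  }
}
```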
Is your feature request related to a problem? Please describe.
Example of an overly large error response:
Could not bulk insert to Data Stream: nginx-main-ext-access-log-v1 {"took"=>4, "errors"=>true, "items"=>[{"create"=>{"_index"=>".ds-nginx-main-ext-access-log-v1-2022.05.16-000001", "_type"=>"_doc", "_id"=>"XwX54IABKbDZYVhbWSKH", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [upstream_status] of type [integer] in document with id 'XwX54IABKbDZYVhbWSKH'. Preview of field's value: '-'", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"-\""}}}}, {"create"=>{"_index"=>".ds-nginx-main-ext-access-log-v1-2022.05.16-000001", "_type"=>"_doc", "_id"=>"YAX54IABKbDZYVhbWSKH", "_version"=>1, "result"=>"created", "_shards"=>{"total"=>2, "successful"=>2, "failed"=>0}, "_seq_no"=>56602, "_primary_term"=>1, "status"=>201}}]}
Here, doc_id XwX54IABKbDZYVhbWSKH failed, while doc_id YAX54IABKbDZYVhbWSKH was inserted OK. When the bulk size is large, the error response is also very large, so the response should ideally only report which documents failed to insert.
Describe the solution you'd like
The Elasticsearch bulk API supports a `filter_path` query parameter to filter the response, so that when a bulk insert partially fails, only the failed documents are reported instead of every item.
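To illustrate the reduction being requested, here is a minimal sketch in Python that mimics client-side what a `filter_path` such as `errors,items.*.*._id,items.*.*.error,items.*.*.status` would do server-side. The function name and the simplified sample response are hypothetical, not part of the Elasticsearch API.

```python
def errors_only(bulk_response):
    """Return a trimmed bulk response keeping only failed items."""
    trimmed_items = []
    for item in bulk_response.get("items", []):
        # Each item is keyed by its action, e.g. {"create": {...}}.
        for action, body in item.items():
            if "error" in body:
                trimmed_items.append({action: {
                    "_id": body.get("_id"),
                    "status": body.get("status"),
                    "error": body["error"],
                }})
    return {"errors": bulk_response.get("errors", False),
            "items": trimmed_items}

# Simplified two-item response (one failure, one success),
# modeled on the error example above:
resp = {
    "took": 4,
    "errors": True,
    "items": [
        {"create": {"_id": "XwX54IABKbDZYVhbWSKH", "status": 400,
                    "error": {"type": "mapper_parsing_exception"}}},
        {"create": {"_id": "YAX54IABKbDZYVhbWSKH", "status": 201,
                    "result": "created"}},
    ],
}
slim = errors_only(resp)
```

With `filter_path` doing this on the server, only the failed document would cross the wire at all.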
Describe alternatives you've considered
Additional context