I have 300+ Filebeat agents installed on client servers and can't use Logstash. Could you introduce this feature, so we can get all the data into Elasticsearch from any source and then, via an ingest pipeline, update the document whenever the same document `_id` is found?
Pipeline
PUT /_ingest/pipeline/upsert-doc-pipeline
{
  "description": "Pipeline to upsert documents based on _id",
  "processors": [
    {
      "fingerprint": {
        "fields": [
          "full_name",
          "person_id"
        ],
        "target_field": "_id",
        "method": "SHA-1"
      }
    }
  ]
}
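The fingerprint processor derives a stable hash from the listed fields, so identical `full_name`/`person_id` pairs always map to the same `_id`. A minimal Python sketch of that idea (the exact bytes Elasticsearch produces will differ, since the processor also encodes field names and an optional salt in its own format):

```python
import hashlib

def fingerprint_id(doc, fields):
    """Derive a deterministic id by hashing the selected field values."""
    h = hashlib.sha1()
    for field in sorted(fields):
        h.update(field.encode())            # mix in the field name as well
        h.update(str(doc[field]).encode())  # then its value
    return h.hexdigest()

a = fingerprint_id({"full_name": "John Doe", "person_id": "1234", "age": 31},
                   ["full_name", "person_id"])
b = fingerprint_id({"full_name": "John Doe", "person_id": "1234", "age": 32},
                   ["full_name", "person_id"])
assert a == b  # age is not part of the fingerprint, so the id is identical
```

Because `age` is excluded from the hashed fields, both documents collapse onto one `_id`.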
The first request creates the doc:

POST /upsert-index/_doc?pipeline=upsert-doc-pipeline
{
  "full_name": "John Doe",
  "person_id": "1234",
  "age": 31
}
A second request with the same fingerprint fields should update that doc instead of creating a new one:

POST /upsert-index/_doc?pipeline=upsert-doc-pipeline
{
  "full_name": "John Doe",
  "person_id": "1234",
  "age": 32
}
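The requested behavior amounts to an id-keyed upsert: the second document with the same fingerprint fields merges into the first entry rather than creating a duplicate. A hypothetical sketch of that semantics, using a plain dict as a stand-in for the index (the `doc_id` helper is an assumption, not the processor's real output):

```python
import hashlib

def doc_id(doc):
    # hypothetical stand-in for the pipeline's SHA-1 over full_name + person_id
    key = f'{doc["full_name"]}|{doc["person_id"]}'
    return hashlib.sha1(key.encode()).hexdigest()

index = {}  # the "index", keyed by _id

def upsert(doc):
    """Create the doc the first time; merge new fields when the same _id reappears."""
    _id = doc_id(doc)
    existing = index.get(_id, {})
    existing.update(doc)  # update semantics: new fields overlay the old ones
    index[_id] = existing
    return _id

first = upsert({"full_name": "John Doe", "person_id": "1234", "age": 31})
second = upsert({"full_name": "John Doe", "person_id": "1234", "age": 32})
assert first == second             # same fingerprint fields, same _id
assert index[first]["age"] == 32   # the existing doc was updated, not duplicated
assert len(index) == 1
```

The key point is the merge step: instead of the index operation's overwrite-or-create behavior, an update is applied when the `_id` already exists.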