SarindraTherese closed 2 years ago
I was able to solve the problem, step by step:
1- Create index
PUT /test-index
{
"settings": {
"number_of_replicas": 2,
"number_of_shards": 2
}
}
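To confirm the settings were applied, the index settings can be read back with the standard _settings endpoint (just a quick sanity check):

GET /test-index/_settings

The response should show "number_of_shards": "2" and "number_of_replicas": "2" under the index settings.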
2- Add field timestamp
PUT _ingest/pipeline/add-current-time
{
  "description" : "automatically add the current time to the documents",
  "processors" : [
    {
      "set" : {
        "field": "timestamp",
        "value": "{{_ingest.timestamp}}"
      }
    }
  ]
}
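Before indexing real documents, the pipeline can be dry-run with the _simulate API (the sample document below is only an illustration):

POST _ingest/pipeline/add-current-time/_simulate
{
  "docs": [
    { "_source": { "my_field": "test" } }
  ]
}

The response shows each simulated document as it would be stored, with the timestamp field added by the set processor.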
3- Create data
PUT test-index/_doc/11?pipeline=add-current-time
{
"my_field": "test numero 11",
"girls group": "spice girl"
}
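To verify that the pipeline actually ran, the document can be fetched back; the _source should now contain the timestamp field alongside the original fields:

GET test-index/_doc/11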
4- Output in kibana
"hits" : [
{
"_index" : "test-index",
"_type" : "_doc",
"_id" : "9",
"_score" : 1.0,
"_source" : {
"my_field" : "test numero 9",
"timestamp" : "2022-09-21T14:34:05.785318623Z"
}
}]
Thanks for the explanation!
I think the schema of the index must be created before the connector is running. Otherwise some fields do not exist yet, and Elasticsearch cannot sort on an unknown field.
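For example, the fields could be mapped explicitly before starting the connector (the field names below follow the earlier example and are only an assumption; adjust them to the real schema):

PUT /test-index/_mapping
{
  "properties": {
    "timestamp": { "type": "date" },
    "my_field":  { "type": "text" }
  }
}

Mapping timestamp as a date up front means it exists with the right type before any connector reads from the index, so sorting on it works from the start.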
Hey Dario! I finally managed to successfully run the Elasticsearch source connector with Apache Kafka.
However, when I run the elasticsearch-source.properties in distributed mode,
I get errors like this:
and this:
The full list of topics is:
In your tutorial, all the indices matching products* are sent to Kafka using the es_ string as a topic prefix. So my question is: am I doing things right here? What's wrong? How can I know whether all the data I send to Elasticsearch is read by Kafka? Thanks
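One way to check is with the standard Kafka CLI tools: list the topics, then consume a few records from one of the es_-prefixed topics. The topic name below is an assumption (the es_ prefix plus the index name); adjust the bootstrap server and topic to your setup:

kafka-topics.sh --bootstrap-server localhost:9092 --list

kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic es_test-index --from-beginning --max-messages 5

If the consumer prints the documents you indexed into Elasticsearch, the connector is reading them and forwarding them to Kafka.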