Closed — molinto closed this issue 4 years ago
COMMAND:
elasticsearch_loader --index-settings-file audit_mapping.json --index audits --http-auth elastic:PASSWORD --type _doc csv audit.csv
MAPPING FILE (audit_mapping.json):
{
"settings" : {
},
"mappings": {
"properties": {
"id": {
"type": "keyword"
},
"module": {
"type": "keyword"
},
"action": {
"type": "keyword"
},
"occured_at": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss"
}
}
}
}
This seems to produce the following mapping in the audits index (Index Management in Kibana):
{
"mapping": {
"properties": {
"action": {
"type": "keyword"
},
"id": {
"type": "keyword"
},
"id;\"module\";\"action\";\"description\";\"level\";\"occured_at\";\"created_user\";\"createdAt\";\"updatedAt\"": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
},
"module": {
"type": "keyword"
},
"occured_at": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss"
}
}
}
}
I was hoping --keys would get around this, but it doesn't seem to do anything. Not sure if that's normal.
This works though :)
elasticsearch_loader --index-settings-file audit_mapping.json --index audits --http-auth elastic:PASSWORD --keys id,action,module,occured_at --type _doc csv audit.csv
Maybe not, it just populates the id in Elasticsearch; all other fields are empty :(
Any ideas please @moshe
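[Editor's note] The fused column name in the Kibana mapping above is what happens when a semicolon-delimited file is parsed with the default comma delimiter: each whole header line comes back as a single field. A quick way to reproduce it with Python's csv module (the sample row below is hypothetical, in the same style as audit.csv):

```python
import csv
import io

# A semicolon-delimited header row in the same style as audit.csv (sample data).
raw = '"id";"module";"action"\n"1";"auth";"login"\n'

# Default delimiter is ',' -- the whole line is parsed as ONE field,
# which matches the fused column name seen in the Kibana mapping.
rows_comma = list(csv.reader(io.StringIO(raw)))
print(rows_comma[0])  # ['id;"module";"action"']

# With delimiter=';' the columns split correctly.
rows_semi = list(csv.reader(io.StringIO(raw), delimiter=';'))
print(rows_semi[0])   # ['id', 'module', 'action']
```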
Tried with a different dataset:
elasticsearch_loader --index signals --http-auth elastic:PASSWORD csv signals.csv
RESULT: Imports with the id column and a single fused ts;iface;snr;bars column
elasticsearch_loader --index signals --http-auth elastic:PASSWORD --keys ts,snr,iface,bars csv signals.csv
RESULT: Imports only the id column
elasticsearch_loader --index-settings-file signal_mapping.json --index signals --http-auth elastic:PASSWORD csv signals.csv
RESULT: Imports with the id column, the fused ts;iface;snr;bars column, and the other fields, but they are empty
signals.csv
"ts";"iface";"snr";"bars"
"2019-03-01 02:16:49";"wwan0";-62.0;5
"2019-03-01 02:25:22";"wwan0";-56.0;5
"2019-03-01 02:26:09";"wwan0";-62.0;5
"2019-03-01 02:36:46";"wwan0";-56.0;5
"2019-03-01 02:36:52";"wwan0";-62.0;5
"2019-03-01 02:37:02";"wwan0";-55.0;5
"2019-03-01 02:37:07";"wwan0";-61.0;5
"2019-03-01 02:44:52";"wwan0";-55.0;5
"2019-03-01 02:44:57";"wwan0";-61.0;5
"2019-03-01 02:46:47";"wwan0";-55.0;5
"2019-03-01 02:46:58";"wwan0";-61.0;5
signal_mapping.json:
{
"settings" : {
},
"mappings": {
"properties": {
"@timestamp": {
"type": "date"
},
"snr": {
"type": "float"
},
"iface": {
"type": "keyword"
},
"bars": {
"type": "byte"
},
"ts": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss"
}
}
}
}
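[Editor's note] As a sanity check, the rows above parse cleanly against this mapping once the semicolon delimiter is honoured. A minimal Python sketch (using the first two data rows of signals.csv; note that Elasticsearch's Java-style date pattern "yyyy-MM-dd HH:mm:ss" corresponds to strptime's "%Y-%m-%d %H:%M:%S"):

```python
import csv
import io
from datetime import datetime

# The first two semicolon-delimited data rows from signals.csv.
raw = (
    '"ts";"iface";"snr";"bars"\n'
    '"2019-03-01 02:16:49";"wwan0";-62.0;5\n'
    '"2019-03-01 02:25:22";"wwan0";-56.0;5\n'
)

reader = csv.DictReader(io.StringIO(raw), delimiter=';')
docs = []
for row in reader:
    docs.append({
        # "yyyy-MM-dd HH:mm:ss" in the mapping == "%Y-%m-%d %H:%M:%S" here;
        # parsing without error shows the rows match the date format.
        'ts': datetime.strptime(row['ts'], '%Y-%m-%d %H:%M:%S'),
        'iface': row['iface'],          # mapped as keyword
        'snr': float(row['snr']),       # mapped as float
        'bars': int(row['bars']),       # mapped as byte (small integer)
    })
print(docs[0]['snr'], docs[0]['bars'])  # -62.0 5
```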
I see that your data is delimited by ';', so you need to add --delimiter ';'
Thanks @moshe.
Couldn't find that in the documentation; I tried it and got 'Error: no such option: --delimiter'
Can you paste the exact command? --delimiter should come AFTER the csv command:
elasticsearch_loader --index ... csv --delimiter ';' cake.csv
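[Editor's note] The option position matters because elasticsearch_loader appears to be a subcommand-style CLI where each subcommand owns its own options, so --delimiter only exists after csv. A minimal stdlib analogue of that layout (hypothetical names, not the loader's real code):

```python
import argparse

# Top-level options (like --index) belong BEFORE the subcommand;
# subcommand options (like --delimiter) belong AFTER it.
parser = argparse.ArgumentParser(prog='loader')
parser.add_argument('--index')                    # top-level option
sub = parser.add_subparsers(dest='command')
csv_cmd = sub.add_parser('csv')
csv_cmd.add_argument('--delimiter', default=',')  # owned by the csv subcommand
csv_cmd.add_argument('source')

# Correct order: --delimiter after "csv".
ok = parser.parse_args(['--index', 'signals', 'csv', '--delimiter', ';', 'signal.csv'])
print(ok.delimiter)  # ;

# Putting --delimiter before "csv" would make the parser exit with
# "unrecognized arguments", the analogue of "no such option: --delimiter".
```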
This works!
elasticsearch_loader --index-settings-file signal_mapping.json --index signals --http-auth elastic:PASSWORD --keys ts,snr,iface,bars --type _doc csv --delimiter ';' signal.csv
Thank you for your help