Open stackimperfect opened 7 years ago
This seems like the wrong filter for this. The geoip filter exists to provide a way to get a geographic location for an IP address.
Getting a geographic location for a country name is a different use case and belongs in a different filter. I recommend creating a filter for this purpose. As an alternative, you could use the translate
filter with a YAML file that maps country codes to geography.
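For example, roughly (a minimal sketch only; the country field name, the dictionary path, and the single Andorra entry are placeholders, and the Elasticsearch mapping for the destination field would need to be a geo_point for the "lat,lon" string to be useful):

# country_geo.yml (dictionary file): ISO code -> "lat,lon" string
"AD": "42.546245,1.601554"

# logstash filter:
filter {
  translate {
    field => "country"                                          # field holding the country code
    destination => "[geoip][coordinates]"                       # receives the looked-up "lat,lon" string
    dictionary_path => "/opt/logstash/dictionaries/country_geo.yml"
  }
}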
Thank you for the answer. After a little bit of digging around: say I have a CSV as follows.
I need a way to use the translate filter in Logstash, which only takes two values, a key and a value.
Given the following, how do I turn this into a key/value pair? Can I use another JSON object as the value, or do both have to be strings?
Can I have a JSON object like:
"AND": { "iso_alpha2": "AD", "latitude": 42.546245, "longitude": 1.601554, "country_name": "Andorra" }
Would that be a proper way of working with this? I can arrange to prepare a JSON file for it.
What would a sample translate configuration look like if I can get such a JSON file? Please help.
@stackimperfect The translate filter can have anything for a value. Example lookup JSON:
{
"foo": {
"bar": "baz",
"fancy": 123
}
}
A lookup of foo would produce { "bar": "baz", "fancy": 123 } as a value.
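As a rough sketch against that example dictionary (the field names some_field and mapdata and the dictionary path are placeholders, not something prescribed by the filter): if some_field contains foo, the looked-up hash lands in mapdata and its members can be referenced with the usual [field][subfield] syntax:

filter {
  translate {
    field => "some_field"                      # value "foo" matches the key in the example above
    destination => "mapdata"                   # receives { "bar" => "baz", "fancy" => 123 }
    dictionary_path => "/path/to/lookup.json"
  }
  mutate {
    add_field => { "bar_copy" => "%{[mapdata][bar]}" }   # -> "baz"
  }
}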
I'm thinking of doing this (dictionary attached as Json3LetterCountryCode.zip):
filter {
  if [type] == "eventstats" {
    grok {
      remove_field => [ "message" ]
      match => { "message" => "(?m)%{TIMESTAMP_ISO8601:sourceTimestamp} \[%{NUMBER:threadid}\] %{LOGLEVEL:loglevel} - %{WORD:envName}\|%{IPORHOST:actualHostMachine}\|%{WORD:applicationName}\|%{NUMBER:empId}\|%{WORD:regionCode}\|%{DATA:country}\|%{DATA:eventName}\|%{NUMBER:staffeventId}\|%{WORD:eventEvent}" }
    }
    if !("_grokparsefailure" in [tags]) {
      translate {
        field => "country"
        destination => "mapdata"
        dictionary_path => "/opt/logstash/GeoIPDataFile/Json3LetterCountryCode.json"
      }
      mutate {
        add_field => [ "[geoip][coordinates]", "%{[mapdata][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[mapdata][latitude]}" ]
        remove_field => [ "mapdata" ]
        add_tag => [ "eventstats" ]
        add_tag => [ "eventFor_%{eventName}" ]
      }
    }
  }
}
and the output:
output {
  if [type] == "eventstats" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "eventstats-%{+YYYY-MM-dd}"
      template => "/opt/logstash/templates/udp-eventstats.json"
      template_name => "eventstats"
      template_overwrite => true
    }
  }
}
I get a strange error: "LogStash::Filters::Translate: can't convert Array into Hash when loading dictionary file at"
I have the translate plugin installed, version logstash-filter-translate-2.1.4.
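Could the problem be that my JSON file is an array at the top level (a list of per-country objects), which the filter cannot turn into a lookup hash? If so, I guess the expected shape is a single top-level object keyed by the 3-letter code, something like (only the Andorra entry from above, everything else elided):

{
  "AND": { "iso_alpha2": "AD", "latitude": 42.546245, "longitude": 1.601554, "country_name": "Andorra" }
}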
@jordansissel According to the documentation for the translate filter:
"The JSON format only supports simple key/value, unnested objects."
Though in my own testing this seems to work OK with nested dictionaries.
Enhancement, please.
I understand this is something that might be very useful for people. A business use case: a user has a profile, and a country is assigned to that profile.
When the log records the information, it takes the profile into account and logs the country name. So all I need now is a way to say: I pass in my country name and get the coordinates for it.
I have full details on Stack Overflow:
http://stackoverflow.com/questions/40628896/reason-something-is-wrong-with-your-configuration-geoip-dat-mutate-logstash
If I can do that, we should be able to show stats per country. I could pass in a 3-letter or 2-letter ISO code if needed.