Closed anitakrueger closed 5 years ago
The error is coming back from your Elasticsearch output, not from the GeoIP filter.
Another user experienced this error and discovered that Elasticsearch was auto-detecting the field as a "date" because the first value indexed into a given day's index happened to look like a date. They proposed explicitly adding the field to the index template in the Elasticsearch Output Plugin: https://github.com/logstash-plugins/logstash-output-elasticsearch/pull/788
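If you write daily indices, a template along these lines pins the field to a non-date type. This is only a sketch: it assumes the geoip filter's default target, so the field ends up at `geoip.postal_code`; the template name and index pattern are placeholders, and the exact shape of the `mappings` block depends on your Elasticsearch version:

```
PUT _template/weblogs
{
  "index_patterns": ["weblogs-*"],
  "mappings": {
    "properties": {
      "geoip": {
        "properties": {
          "postal_code": { "type": "keyword" }
        }
      }
    }
  }
}
```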
@yaauie Thank you so much! Indeed, this seems to be exactly what's going on. Funnily enough, in today's index the field is of type text and I get no failures, but Kibana shows the field with conflicting types for the past few months. Now it makes sense why we didn't notice this on the initial rollout. I will go ahead and define the mapping in our index template.
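Roughly, the plan is to point our elasticsearch output at that template, something like the sketch below (the paths, names, and hosts here are placeholders, not our exact setup):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"
    # load a custom index template that maps geoip.postal_code explicitly
    template => "/etc/logstash/templates/weblogs-template.json"
    template_name => "weblogs"
    template_overwrite => true
  }
}
```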
Thanks again and Merry Christmas!
I gathered from the MaxMind website that postal codes for the UK are returned as only the first 2-4 characters (https://dev.maxmind.com/geoip/geoip2/geoip2-city-country-csv-databases/). Unfortunately, this results in all of our parsed entries going to the dead letter queue, as the geoip plugin seems to deem them to be in an invalid format.
We parse our webserver logs and use the geoip filter on the clientip field. All our webserver log entries end up in the dead letter queue because the geoip plugin is throwing this error:
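The geoip part of our pipeline is roughly the following (a minimal sketch; clientip is the field holding the client address parsed from the access logs, everything else is left at its defaults):

```
filter {
  geoip {
    # look up location data (including postal_code) for the client address
    source => "clientip"
    target => "geoip"
  }
}
```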