Open bozho opened 2 years ago
Huh, this appears to be an Elasticsearch issue. When I change both timestamp field names to `logtime` (i.e. for both nginx access and error logs), I get the same ES parsing error for the `logtime` field.

It's as if ES automatically picks up the date string format from the initial entries (nginx access log entries are much more frequent) and then uses that string format to parse all future date strings. Once there's an incoming nginx error log entry with a differently formatted date/time in its `logtime` field, ES complains.
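If dynamic date detection is indeed the culprit, one workaround would be to tell ES up front which date formats to detect. This is only a sketch, not something from the thread: the index name `fluentd` and the exact format strings are assumptions, and the two formats are joined with `||` so the resulting field mapping accepts both.

```json
PUT fluentd
{
  "mappings": {
    "dynamic_date_formats": [
      "yyyy-MM-dd'T'HH:mm:ssXXX||yyyy/MM/dd HH:mm:ss"
    ]
  }
}
```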
Problem

Hi,

We are running a Docker swarm and I'm working on implementing log shipping to ES+Kibana. Fluentd also runs as a service in the swarm. Individual docker services will be redirecting their logs to `fluentd`; each service has its own `fluentd` tag. I will probably end up using `logstash_format`, but as an exercise, I'm trying to consolidate all timestamps from different logs into a single timestamp field to be used by Kibana as the time field on an index pattern.

I'm currently focusing on nginx logs. I've configured access logs to be in JSON format. Nginx error logs are not configurable, so we need to parse them using a regexp.

Steps to replicate

This is our current configuration:
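Purely for illustration, a fluentd setup along these lines would parse both log types; the tag names, `key_name`, and the error-log regexp here are my assumptions, not the reporter's actual configuration:

```
<filter nginx.access>
  @type parser
  key_name log
  <parse>
    @type json
  </parse>
</filter>

<filter nginx.error>
  @type parser
  key_name log
  <parse>
    @type regexp
    expression /^(?<logtime>\d{4}\/\d{2}\/\d{2} \d{2}:\d{2}:\d{2}) \[(?<log_level>\w+)\] (?<message>.*)$/
  </parse>
</filter>
```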
Expected Behavior or What you need to ask
The above configuration parses log timestamps in both logs fine and they are recognised as `date` types. The nginx `$time_iso8601` variable is formatted like this: `2022-06-14T07:30:53+00:00`. The error log timestamp is formatted like this: `2022/06/14 07:41:19`.
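To make the difference between the two formats concrete, here is a small Python sketch parsing each (format strings only; this is not part of the fluentd pipeline):

```python
from datetime import datetime

# nginx $time_iso8601 (access log), e.g. 2022-06-14T07:30:53+00:00
access = datetime.strptime("2022-06-14T07:30:53+00:00", "%Y-%m-%dT%H:%M:%S%z")

# nginx error log timestamp, e.g. 2022/06/14 07:41:19 (no timezone info)
error = datetime.strptime("2022/06/14 07:41:19", "%Y/%m/%d %H:%M:%S")

print(access.isoformat())  # 2022-06-14T07:30:53+00:00
print(error.isoformat())   # 2022-06-14T07:41:19
```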
Then I realised I end up with two different log time fields: `timestamp` (parsed from nginx access log entries) and `logtime` (parsed from nginx error log entries). Kibana requires a single timestamp field to be selected for time-based index patterns.

Trivial, I thought, I'll just rename the regexp group `logtime` to `timestamp` when parsing the nginx error log. The problem is that Elasticsearch then rejects nginx error log entries with:

I can see the `timestamp` field in the offending record, formatted as `2022/06/14 07:41:20`. If I use any other field name (like `logtime`), the parsed data is the same, ES accepts the entries, and the field shows up as a `date` field in Kibana.

Is `timestamp` a reserved/special field name in ES? `nginx.access` entries with a `timestamp` field are accepted by ES, with the format `2022-06-14T07:30:53+00:00`.

Is the `timestamp` issue even a plugin issue, or a `fluentd` issue?

Thank you!
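One way to make ES accept both formats would be to map the field explicitly with both date formats, so dynamic date detection never gets a say. This is only a hedged sketch: the index name `fluentd` is an assumption, and the field/format names are taken from the report above.

```json
PUT fluentd
{
  "mappings": {
    "properties": {
      "timestamp": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ssXXX||yyyy/MM/dd HH:mm:ss"
      }
    }
  }
}
```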
Using Fluentd and ES plugin versions
Docker image based off the official `fluent/fluentd:v1.14-1` Docker image (we replace the ES libs with v7.7 and install this plugin).
ES version: 5.4.1