uken / fluent-plugin-elasticsearch

Apache License 2.0

logstash_format=false is ignored #750

Closed dbaumgarten closed 4 years ago

dbaumgarten commented 4 years ago


Problem

Hello there,

I am using fluentd to ship logs from Kubernetes to Elasticsearch, using this Docker image.

I want fluentd to write the logs into an index that has already been created by the administrators of the Elasticsearch server.

I have set index_name to the desired index and set logstash_format to false (because, according to the docs, index_name is otherwise ignored).

But for some reason fluentd behaves as if logstash_format were set to true: it writes the logs to an index called logstash-<date>.

I am really confused. Why does fluentd (or the Elasticsearch plugin) ignore the logstash_format false setting?


Expected Behavior

Logs are written to the index called "caas-gks-dev"

Fluentd config (as printed at startup)

<match **>
    @type elasticsearch
    @id out_es
    @log_level "info"
    include_tag_key true
    host "3ef313bba04646089393a373a4922070.kibana.mlaas.prod.mls.projects.de-wob-3.cloud.vwgroup.com"
    port 9243
    path ""
    scheme https
    ssl_verify true
    ssl_version TLSv1_2
    user "caas_beat_writer"
    password xxxxxx
    reload_connections false
    reconnect_on_error true
    reload_on_failure true
    log_es_400_reason false
    logstash_prefix "logstash"
    logstash_dateformat "%Y.%m.%d"
    logstash_format false
    index_name "caas-gks-dev"
    type_name "fluentd"
    include_timestamp false
    template_name
    template_file
    template_overwrite false
    sniffer_class_name "Fluent::Plugin::ElasticsearchSimpleSniffer"
    request_timeout 5s
    <buffer>
      flush_thread_count 8
      flush_interval 5s
      chunk_limit_size 2M
      queue_limit_length 32
      retry_max_interval 30
      retry_forever true
    </buffer>
  </match>

...


dbaumgarten commented 4 years ago

I found the problem. If logstash_format is false and include_timestamp is also false (the default), no timestamps are included in the log data, so the logs do not show up in Kibana. After setting include_timestamp to true, everything works fine.
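Based on that resolution, a minimal sketch of the relevant settings (connection options, credentials, and buffering omitted; the index name is the one from the config above, and the key change is include_timestamp true):

```
<match **>
  @type elasticsearch
  # Write to a fixed, pre-created index instead of logstash-<date>
  logstash_format false
  index_name "caas-gks-dev"
  # With logstash_format false, the plugin does not add a timestamp
  # field by default; Kibana needs one to display the documents.
  include_timestamp true
</match>
```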