elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

Environment variables aren't interpolated in plugin ID field #6833

Open hartfordfive opened 7 years ago

hartfordfive commented 7 years ago

When loading the node pipeline stats via /_node/stats/pipeline, environment variables used within the ID field of a plugin don't seem to be interpolated. In my case, I use an env var, ${LOGSTASH_TENANT}, to indicate the tenant name when running multiple internal tenants (in a Docker use case).

Version: 5.2.2
Operating System: Darwin

Sample Logstash Config: (indexing.conf)

input {
  kafka {
    id => "${LOGSTASH_TENANT}_indexing_input_kafka1"
    group_id => "cg-${LOGSTASH_TENANT}-indexing"
    decorate_events => true
    consumer_threads => "${LOGSTASH_INDEXER_KAFKA_CONSUMER_THREADS}"
    bootstrap_servers => "${LOGSTASH_KAFKA_BROKERS_LIST}"
    topics => [ "logstash-${LOGSTASH_TENANT}-indexing" ]
    codec => "json"
  }
}

filter {
  # ************* The following block deals with empty messages *************
  grok {
    id => "${LOGSTASH_TENANT}_indexing_filter_grok1"
    match => [
      "message", "^\s*$"
    ]
    add_tag => ["empty_message"]
    tag_on_failure => ["not_empty"]
  }

  if "empty_message" in [tags] {
    drop {}
  }

  if "not_empty" in [tags] {
    mutate {
      id => "${LOGSTASH_TENANT}_indexing_filter_mutate1"
      remove_tag => [ "not_empty" ]
    }
  }

  # ************* The following section deals with adding GeoIP data *************
  if [clientip] {
    mutate {
      id => "${LOGSTASH_TENANT}_indexing_filter_mutate2"
      add_field => {
        "_ip_field" => "%{clientip}"
      }
    }
  }
  if [client_ip] {
    mutate {
      id => "${LOGSTASH_TENANT}_indexing_filter_mutate3"
      add_field => {
        "_ip_field" => "%{client_ip}"
      }
    }
  }

  if [_ip_field] {
    geoip {
      id => "${LOGSTASH_TENANT}_indexing_filter_geoip1"
      source => "%{_ip_field}"
      target => "geoip"
      fields => ["city_name", "continent_code", "country_code2", "dma_code", "ip", "latitude", "longitude", "region_name", "region_code", "timezone", "location"]
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
      lru_cache_size => 5000
    }
    mutate {
      id => "${LOGSTASH_TENANT}_indexing_filter_mutate4"
      convert => [ "[geoip][coordinates]", "float" ]
    }
    mutate {
      id => "${LOGSTASH_TENANT}_indexing_filter_mutate5"
      remove_field => [ "_ip_field" ]
    }
  }

  de_dot {
    id => "${LOGSTASH_TENANT}_indexing_filter_dedot1"
    add_tag => ["de_dot"]
  }

  # ************* Finally, generate a UUID for this event, which will be used as value for _id *************
  uuid {
    id => "${LOGSTASH_TENANT}_indexing_filter_uuid1"
    #target => "@uuid"
    target => "[@metadata][uuid]"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    id => "${LOGSTASH_TENANT}_indexing_output_elasticsearch1"
    hosts => [ "${LOGSTASH_ES_HOST}" ]
    manage_template => "false"
    action => "update"
    doc_as_upsert => true
    document_id => "%{[@metadata][uuid]}"
    index => "%{team}-%{environment}-%{type}-%{+YYYY.MM.dd}"
  }
}

Sample Logstash Startup Command (for testing):

LOGSTASH_TENANT="team1" \ 
 LOGSTASH_ENVIRONMENT="development" \ 
 LOGSTASH_INDEXER_KAFKA_CONSUMER_THREADS=2  \  LOGSTASH_KAFKA_BROKERS_LIST="localhost:9092"  \ 
 LOGSTASH_ES_HOST="https://localhost:9200" \  
  ./bin/logstash -f indexing.conf --log.level=info --log.format=json

Sample Logstash Stats:
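
For reference, a payload like the one below can be fetched with a plain HTTP GET against the monitoring API address reported in the output, e.g.:

curl -s http://127.0.0.1:9600/_node/stats/pipeline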

{
  "host": "dev.local",
  "version": "5.2.2",
  "http_address": "127.0.0.1:9600",
  "id": "cd23651e-09e6-4da9-ace5-3bcc34296809",
  "name": "dev.local",
  "pipeline": {
    "events": {
      "duration_in_millis": 0,
      "in": 0,
      "filtered": 0,
      "out": 0
    },
    "plugins": {
      "inputs": [],
      "filters": [
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_geoip1",
          "name": "geoip"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-3",
          "name": "drop"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-12",
          "name": "mutate"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_mutate1",
          "name": "mutate"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_mutate3",
          "name": "mutate"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_dedot1",
          "name": "de_dot"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-14",
          "name": "date"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_mutate5",
          "name": "mutate"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-13",
          "name": "date"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-11",
          "name": "mutate"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_grok1",
          "patterns_per_field": {
            "message": 1
          },
          "name": "grok"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_uuid1",
          "name": "uuid"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_mutate4",
          "name": "mutate"
        },
        {
          "id": "${LOGSTASH_TENANT}_indexing_filter_mutate2",
          "name": "mutate"
        }
      ],
      "outputs": [
        {
          "id": "${LOGSTASH_TENANT}_indexing_output_elasticsearch1",
          "name": "elasticsearch"
        },
        {
          "id": "2976b184398a75080278c36d732a58060d2bab9c-16",
          "name": "stdout"
        }
      ]
    },
    "reloads": {
      "last_error": null,
      "successes": 0,
      "last_success_timestamp": null,
      "last_failure_timestamp": null,
      "failures": 0
    },
    "queue": {
      "type": "memory"
    }
  }
}
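
As shown above, the ${LOGSTASH_TENANT} placeholder is returned verbatim in the plugin IDs. With LOGSTASH_TENANT=team1 (as in the startup command), the uuid filter entry, for example, would be expected to come back resolved:

{
  "id": "team1_indexing_filter_uuid1",
  "name": "uuid"
}

rather than with the raw placeholder that is currently returned:

{
  "id": "${LOGSTASH_TENANT}_indexing_filter_uuid1",
  "name": "uuid"
}
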
suyograo commented 7 years ago

Agreed this is a bug. The monitoring API should resolve the variables defined in the config.

robbavey commented 7 years ago

This looks like it might be related to #6696

original-brownbear commented 7 years ago

@robbavey it looks like #6696 was just an error with the validation (see PR https://github.com/elastic/logstash/pull/7411), so I think this one is different, right?

jordansissel commented 6 years ago

@robbavey is this resolved?

robbavey commented 6 years ago

@jordansissel It isn't. There was some refactoring that meant the linked fix no longer works. I had a quick look, and it should be simple to rework the fix.

derentis commented 6 years ago

Hi Elastic Team. Any update on this issue?

Edit: I'm using Logstash 6.3 and the bug still seems to exist. I can't seem to use any variable for this field; it just sends through as %{[fieldname]}.

Edit 2: I also tried %{fieldname}, same deal. The exact same variable is used for the index, and that works fine, so I know it's accessible and correct.
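
For reference, the two variable syntaxes in play here are different things: ${...} refers to an environment variable, while %{...} is a per-event sprintf reference to an event field. A side-by-side taken from the config above:

elasticsearch {
  id    => "${LOGSTASH_TENANT}_indexing_output_elasticsearch1"   # ${...}: environment variable substitution in the config
  index => "%{team}-%{environment}-%{type}-%{+YYYY.MM.dd}"       # %{...}: per-event sprintf reference to event fields
}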

mloeschner commented 6 years ago

Is there a workaround? I've tried to substitute credentials for basic authentication using environment variables in the http input plugin config.
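
For context, the kind of config being described would look roughly like this. This is only a minimal sketch, assuming the http input's user/password options for basic auth; the variable names are placeholders:

input {
  http {
    port     => 8080
    user     => "${HTTP_AUTH_USER}"
    password => "${HTTP_AUTH_PASSWORD}"
  }
}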