logstash-plugins / logstash-input-kinesis

Logstash Plugin for AWS Kinesis Input
Apache License 2.0

Multiple Kinesis Inputs Not Working #74

Closed · tfendt closed this issue 4 years ago

tfendt commented 4 years ago

I am trying to configure a single Logstash service to pull from multiple Kinesis streams. All of the streams are in the same AWS account as the Logstash instance. However, only logs from the first input listed are coming through, and nothing in the Logstash logs indicates that records are being pulled from the other input.

Am I missing something in the config to get this working?

Here is what my config looks like:

input {
  kinesis {
    id => "s1-stream"
    kinesis_stream_name => "s1-stream"
    codec => cloudwatch_logs
    add_field => {"env" => "s1"}
  }
  kinesis {
    id => "s2-stream"
    kinesis_stream_name => "s2-stream"
    codec => cloudwatch_logs
    add_field => {"env" => "s2"}
  }
}

filter {
    grok {
        patterns_dir => "/etc/logstash/patterns/lambda-patterns"
        match => ["message", "%{TIMESTAMP_ISO8601:log_timestamp}\s+%{thread:thread_id}\s+\[%{WORD:level}\]\s+%{GREEDYDATA:message_content}" ]
    }

    if "_grokparsefailure" in [tags] {
        drop {}
    }

    if [level] == "DEBUG" {
        drop {}
    }
}

output {
    if [env] == "s1" {
        elasticsearch {
            hosts => "..."
            index => "logstash-s1-%{+YYYY.MM}"
        }
    }
    if [env] == "s2" {
        elasticsearch {
            hosts => "..."
            index => "logstash-s2-%{+YYYY.MM}"
        }
    }
}
tfendt commented 4 years ago

Figured out the issue: I was missing the application_name property. Both inputs were using the default "logstash" application name, so they both pointed at the same DynamoDB lease table. As a result, the second input was picking up the first input's lease key even though it reads a completely separate stream.
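
For reference, a minimal sketch of the corrected input block, assuming the same stream names as above. The application_name values here are illustrative; they just need to be distinct so each input gets its own KCL lease table:

input {
  kinesis {
    id => "s1-stream"
    kinesis_stream_name => "s1-stream"
    application_name => "logstash-s1"   # distinct per input, so each stream gets its own lease table
    codec => cloudwatch_logs
    add_field => {"env" => "s1"}
  }
  kinesis {
    id => "s2-stream"
    kinesis_stream_name => "s2-stream"
    application_name => "logstash-s2"   # distinct per input
    codec => cloudwatch_logs
    add_field => {"env" => "s2"}
  }
}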