elastic / logstash

Logstash - transport and process your logs, events, or other data
https://www.elastic.co/products/logstash

[7.6.0] Could not determine ID for output/elasticsearch #11698

Open · samary opened this issue 4 years ago

samary commented 4 years ago

Hi,

I'm facing the following error when I disable java_execution and split my configuration into multiple files:

Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Could not determine ID for output/elasticsearch, source don't matched: [file]/etc/logstash/conf.d/tests/002_output.conf

It works fine when I enable java_execution (see output below), but I need to disable it because I need to receive events in the correct order to use the aggregate plugin. It also worked in previous versions (7.5.2 and 7.4.1).
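
For reference, here is a simplified sketch of the kind of aggregate filter this pipeline relies on (the task_id field and the code block are illustrative placeholders, not my real configuration); the aggregate filter only builds its map correctly when events are processed in order by a single worker, which is why I run with --pipeline.workers 1:

filter {
  aggregate {
    # illustrative correlation field, not taken from my real pipeline
    task_id => "%{transaction_id}"
    # accumulate a counter across the events of one task
    code => "map['line_count'] ||= 0; map['line_count'] += 1"
    push_map_as_event_on_timeout => true
    timeout => 120
  }
}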

My tests show that the problem is not related to the content of the config itself (I also tested different outputs such as stdout and reordered the files) but rather lies in the configuration loading mechanism: it always fails on the second file.
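
Since the error says it cannot determine the plugin ID, one workaround I plan to try (untested; the id value below is arbitrary) is to give the output an explicit id, which every Logstash plugin accepts:

output {
    elasticsearch {
      # arbitrary explicit id, only meant to bypass the automatic ID derivation
      id => "es_test_output"
      hosts => ["http://elasticsearch:9200"]
      user => "elastic"
      password => "elastic"
      sniffing => true
      index => "test"
   }
}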

Thanks for your help.

Kind regards,

Samir


General information.

001_input.conf

input {
  beats {
    port => 5440
    ssl => true
    ssl_certificate => "/etc/logstash/conf.d/logstash.crt"
    ssl_key => "/etc/logstash/conf.d/logstash.key"
    include_codec_tag => false
  }
}

002_output.conf

output {
    elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      user => "elastic"
      password => "elastic"
      sniffing => true
      index => "test"
   }
}
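
For completeness, here is a single-file equivalent I can use to check whether the failure is specific to multi-file loading (the content is simply the two files above concatenated into one hypothetical 000_combined.conf):

input {
  beats {
    port => 5440
    ssl => true
    ssl_certificate => "/etc/logstash/conf.d/logstash.crt"
    ssl_key => "/etc/logstash/conf.d/logstash.key"
    include_codec_tag => false
  }
}
output {
    elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      user => "elastic"
      password => "elastic"
      sniffing => true
      index => "test"
   }
}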

Here are the debug outputs of the two commands (without and with java_execution):

/usr/share/logstash/bin/logstash --path.settings /etc/logstash --path.config "/etc/logstash/conf.d/tests/*.conf" --log.level=debug --http.host 0.0.0.0 --pipeline.workers 1 --java-execution false
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-03-17T14:26:49,377][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2020-03-17T14:26:49,524][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x69503026 @directory="/usr/share/logstash/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-03-17T14:26:49,538][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2020-03-17T14:26:49,541][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5c65001c @directory="/usr/share/logstash/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-03-17T14:26:49,966][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-03-17T14:26:49,970][DEBUG][logstash.runner          ] node.name: "fd5c71bfc769"
[2020-03-17T14:26:49,972][DEBUG][logstash.runner          ] *path.config: "/etc/logstash/conf.d/tests/*.conf"
[2020-03-17T14:26:49,974][DEBUG][logstash.runner          ] *path.data: "/var/lib/logstash" (default: "/usr/share/logstash/data")
[2020-03-17T14:26:49,976][DEBUG][logstash.runner          ] modules.cli: []
[2020-03-17T14:26:49,979][DEBUG][logstash.runner          ] modules: []
[2020-03-17T14:26:49,981][DEBUG][logstash.runner          ] modules_list: []
[2020-03-17T14:26:49,983][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-03-17T14:26:49,988][DEBUG][logstash.runner          ] modules_setup: false
[2020-03-17T14:26:49,990][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-03-17T14:26:49,992][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-03-17T14:26:49,994][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-03-17T14:26:49,996][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-03-17T14:26:49,998][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-03-17T14:26:50,000][DEBUG][logstash.runner          ] metric.collect: true
[2020-03-17T14:26:50,001][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-03-17T14:26:50,003][DEBUG][logstash.runner          ] pipeline.system: false
[2020-03-17T14:26:50,006][DEBUG][logstash.runner          ] *pipeline.workers: 1 (default: 4)
[2020-03-17T14:26:50,010][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-03-17T14:26:50,011][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-03-17T14:26:50,013][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-03-17T14:26:50,015][DEBUG][logstash.runner          ] *pipeline.java_execution: false (default: true)
[2020-03-17T14:26:50,017][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-03-17T14:26:50,019][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2020-03-17T14:26:50,021][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2020-03-17T14:26:50,028][DEBUG][logstash.runner          ] path.plugins: []
[2020-03-17T14:26:50,029][DEBUG][logstash.runner          ] config.debug: false
[2020-03-17T14:26:50,031][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-03-17T14:26:50,033][DEBUG][logstash.runner          ] version: false
[2020-03-17T14:26:50,035][DEBUG][logstash.runner          ] help: false
[2020-03-17T14:26:50,037][DEBUG][logstash.runner          ] log.format: "plain"
[2020-03-17T14:26:50,039][DEBUG][logstash.runner          ] *http.host: "0.0.0.0" (default: "127.0.0.1")
[2020-03-17T14:26:50,043][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-03-17T14:26:50,046][DEBUG][logstash.runner          ] http.environment: "production"
[2020-03-17T14:26:50,048][DEBUG][logstash.runner          ] queue.type: "memory"
[2020-03-17T14:26:50,050][DEBUG][logstash.runner          ] queue.drain: false
[2020-03-17T14:26:50,051][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-03-17T14:26:50,052][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-03-17T14:26:50,054][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-03-17T14:26:50,055][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-03-17T14:26:50,056][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-03-17T14:26:50,057][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-03-17T14:26:50,059][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-03-17T14:26:50,060][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-03-17T14:26:50,061][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-03-17T14:26:50,063][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-03-17T14:26:50,064][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-03-17T14:26:50,065][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-03-17T14:26:50,067][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-03-17T14:26:50,068][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-03-17T14:26:50,069][DEBUG][logstash.runner          ] *keystore.file: "/etc/logstash/logstash.keystore" (default: "/usr/share/logstash/config/logstash.keystore")
[2020-03-17T14:26:50,071][DEBUG][logstash.runner          ] *path.queue: "/var/lib/logstash/queue" (default: "/usr/share/logstash/data/queue")
[2020-03-17T14:26:50,072][DEBUG][logstash.runner          ] *path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue" (default: "/usr/share/logstash/data/dead_letter_queue")
[2020-03-17T14:26:50,073][DEBUG][logstash.runner          ] *path.settings: "/etc/logstash" (default: "/usr/share/logstash/config")
[2020-03-17T14:26:50,074][DEBUG][logstash.runner          ] *path.logs: "/var/log/logstash" (default: "/usr/share/logstash/logs")
[2020-03-17T14:26:50,076][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-03-17T14:26:50,081][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-03-17T14:26:50,082][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-03-17T14:26:50,084][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-03-17T14:26:50,086][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-03-17T14:26:50,087][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-03-17T14:26:50,089][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-03-17T14:26:50,091][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-03-17T14:26:50,092][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-03-17T14:26:50,094][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-03-17T14:26:50,095][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-03-17T14:26:50,096][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-03-17T14:26:50,098][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-03-17T14:26:50,099][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-03-17T14:26:50,103][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-03-17T14:26:50,106][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-03-17T14:26:50,107][DEBUG][logstash.runner          ] node.uuid: ""
[2020-03-17T14:26:50,109][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-03-17T14:26:50,161][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-03-17T14:26:50,257][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-03-17T14:26:50,322][DEBUG][logstash.agent           ] Setting up metric collection
[2020-03-17T14:26:50,419][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:26:50,481][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuacctresource] File /sys/fs/cgroup/cpuacct/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpuacct.usage cannot be found, try providing an override 'ls.cgroup.cpuacct.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:26:50,490][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_period_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:26:50,494][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_quota_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:26:50,500][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.stat cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:26:50,720][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:26:50,858][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-03-17T14:26:50,865][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-03-17T14:26:50,904][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:26:50,922][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:26:51,019][DEBUG][logstash.agent           ] Starting agent
[2020-03-17T14:26:51,159][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>[]}
[2020-03-17T14:26:51,164][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/tests/001_input.conf"}
[2020-03-17T14:26:51,177][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/tests/002_output.conf"}
[2020-03-17T14:26:51,232][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-03-17T14:26:51,243][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2020-03-17T14:26:52,826][DEBUG][org.reflections.Reflections] going to scan these urls:
jar:file:/usr/share/logstash/logstash-core/lib/jars/logstash-core.jar!/
[2020-03-17T14:26:52,897][INFO ][org.reflections.Reflections] Reflections took 66 ms to scan 1 urls, producing 20 keys and 40 values
[2020-03-17T14:26:52,913][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Codec
[2020-03-17T14:26:52,914][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Input
[2020-03-17T14:26:52,916][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.RubyBasicObject -> org.jruby.RubyObject
[2020-03-17T14:26:52,918][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Cloneable -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,919][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.IRubyObject -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,920][DEBUG][org.reflections.Reflections] expanded subtype java.io.Serializable -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,921][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Comparable -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,924][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.marshal.CoreObjectType -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,925][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.InstanceVariables -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,926][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.InternalVariables -> org.jruby.RubyBasicObject
[2020-03-17T14:26:52,928][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Output
[2020-03-17T14:26:52,930][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Metric -> co.elastic.logstash.api.NamespacedMetric
[2020-03-17T14:26:52,932][DEBUG][org.reflections.Reflections] expanded subtype java.security.SecureClassLoader -> java.net.URLClassLoader
[2020-03-17T14:26:52,933][DEBUG][org.reflections.Reflections] expanded subtype java.lang.ClassLoader -> java.security.SecureClassLoader
[2020-03-17T14:26:53,039][DEBUG][org.reflections.Reflections] expanded subtype java.io.Closeable -> java.net.URLClassLoader
[2020-03-17T14:26:53,041][DEBUG][org.reflections.Reflections] expanded subtype java.lang.AutoCloseable -> java.io.Closeable
[2020-03-17T14:26:53,043][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Comparable -> java.lang.Enum
[2020-03-17T14:26:53,044][DEBUG][org.reflections.Reflections] expanded subtype java.io.Serializable -> java.lang.Enum
[2020-03-17T14:26:53,047][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Filter
[2020-03-17T14:26:53,253][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"beats", :type=>"input", :class=>LogStash::Inputs::Beats}
[2020-03-17T14:26:53,426][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2020-03-17T14:26:53,460][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_76525626-2161-484e-ba83-929d4ef91609"
[2020-03-17T14:26:53,462][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2020-03-17T14:26:53,466][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-03-17T14:26:53,500][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_certificate = "/etc/logstash/conf.d/logstash.crt"
[2020-03-17T14:26:53,502][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@include_codec_tag = false
[2020-03-17T14:26:53,503][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@id = "98d74a215b941489d03f0e1f0c326991b2c101a7875f3d422521399d0a92368f"
[2020-03-17T14:26:53,504][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_key = "/etc/logstash/conf.d/logstash.key"
[2020-03-17T14:26:53,512][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@port = 5440
[2020-03-17T14:26:53,513][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl = true
[2020-03-17T14:26:53,517][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@enable_metric = true
[2020-03-17T14:26:53,532][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_76525626-2161-484e-ba83-929d4ef91609", enable_metric=>true, charset=>"UTF-8">
[2020-03-17T14:26:53,540][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@add_field = {}
[2020-03-17T14:26:53,543][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2020-03-17T14:26:53,544][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2020-03-17T14:26:53,546][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@add_hostname = false
[2020-03-17T14:26:53,547][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2020-03-17T14:26:53,548][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_peer_metadata = false
[2020-03-17T14:26:53,549][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2020-03-17T14:26:53,550][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2020-03-17T14:26:53,553][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2020-03-17T14:26:53,554][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2020-03-17T14:26:53,557][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2020-03-17T14:26:53,558][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@executor_threads = 4
[2020-03-17T14:26:53,593][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2020-03-17T14:26:53,600][ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Could not determine ID for output/elasticsearch, source don't matched: [file]/etc/logstash/conf.d/tests/002_output.conf:3:5:```\noutput {\n    elasticsearch {\n      hosts => [\"http://elasticsearch:9200\"]\n      user => \"elastic\"\n      password => \"elastic\"\n      sniffing => true\n      index => \"test\"\n   }\n}\n\n```", :backtrace=>["org/logstash/plugins/PluginFactoryExt.java:193:in `plugin'", "org/logstash/plugins/PluginFactoryExt.java:167:in `plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:77:in `plugin'", "(eval):12:in `initialize'", "org/jruby/RubyKernel.java:1052:in `eval'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:51:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:96:in `initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:36:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:326:in `block in converge_state'"]}
[2020-03-17T14:26:53,673][DEBUG][logstash.agent           ] Starting puma
[2020-03-17T14:26:53,689][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2020-03-17T14:26:53,709][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-03-17T14:26:53,712][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2020-03-17T14:26:53,716][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2020-03-17T14:26:53,718][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2020-03-17T14:26:53,754][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>0}
[2020-03-17T14:26:53,773][DEBUG][logstash.api.service     ] [api-service] start
[2020-03-17T14:26:53,802][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>0}
[2020-03-17T14:26:53,977][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-17T14:26:58,872][INFO ][logstash.runner          ] Logstash shut down.
/usr/share/logstash/bin/logstash --path.settings /etc/logstash --path.config "/etc/logstash/conf.d/tests/*.conf" --log.level=debug --http.host 0.0.0.0 --pipeline.workers 1 --java-execution true
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2020-03-17T14:28:03,602][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2020-03-17T14:28:03,949][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x59e7a6d8 @directory="/usr/share/logstash/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2020-03-17T14:28:03,965][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2020-03-17T14:28:03,994][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x1d6c69ff @directory="/usr/share/logstash/modules/fb_apache/configuration", @module_name="fb_apache", @kibana_version_parts=["6", "0", "0"]>}
[2020-03-17T14:28:04,678][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2020-03-17T14:28:04,683][DEBUG][logstash.runner          ] node.name: "fd5c71bfc769"
[2020-03-17T14:28:04,692][DEBUG][logstash.runner          ] *path.config: "/etc/logstash/conf.d/tests/*.conf"
[2020-03-17T14:28:04,696][DEBUG][logstash.runner          ] *path.data: "/var/lib/logstash" (default: "/usr/share/logstash/data")
[2020-03-17T14:28:04,703][DEBUG][logstash.runner          ] modules.cli: []
[2020-03-17T14:28:04,706][DEBUG][logstash.runner          ] modules: []
[2020-03-17T14:28:04,709][DEBUG][logstash.runner          ] modules_list: []
[2020-03-17T14:28:04,716][DEBUG][logstash.runner          ] modules_variable_list: []
[2020-03-17T14:28:04,723][DEBUG][logstash.runner          ] modules_setup: false
[2020-03-17T14:28:04,730][DEBUG][logstash.runner          ] config.test_and_exit: false
[2020-03-17T14:28:04,733][DEBUG][logstash.runner          ] config.reload.automatic: false
[2020-03-17T14:28:04,734][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2020-03-17T14:28:04,740][DEBUG][logstash.runner          ] config.support_escapes: false
[2020-03-17T14:28:04,743][DEBUG][logstash.runner          ] config.field_reference.parser: "STRICT"
[2020-03-17T14:28:04,746][DEBUG][logstash.runner          ] metric.collect: true
[2020-03-17T14:28:04,751][DEBUG][logstash.runner          ] pipeline.id: "main"
[2020-03-17T14:28:04,752][DEBUG][logstash.runner          ] pipeline.system: false
[2020-03-17T14:28:04,755][DEBUG][logstash.runner          ] *pipeline.workers: 1 (default: 4)
[2020-03-17T14:28:04,758][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2020-03-17T14:28:04,760][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2020-03-17T14:28:04,762][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2020-03-17T14:28:04,764][DEBUG][logstash.runner          ] pipeline.java_execution: true
[2020-03-17T14:28:04,769][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2020-03-17T14:28:04,770][DEBUG][logstash.runner          ] pipeline.plugin_classloaders: false
[2020-03-17T14:28:04,772][DEBUG][logstash.runner          ] pipeline.separate_logs: false
[2020-03-17T14:28:04,775][DEBUG][logstash.runner          ] path.plugins: []
[2020-03-17T14:28:04,776][DEBUG][logstash.runner          ] config.debug: false
[2020-03-17T14:28:04,779][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2020-03-17T14:28:04,782][DEBUG][logstash.runner          ] version: false
[2020-03-17T14:28:04,785][DEBUG][logstash.runner          ] help: false
[2020-03-17T14:28:04,788][DEBUG][logstash.runner          ] log.format: "plain"
[2020-03-17T14:28:04,790][DEBUG][logstash.runner          ] *http.host: "0.0.0.0" (default: "127.0.0.1")
[2020-03-17T14:28:04,792][DEBUG][logstash.runner          ] http.port: 9600..9700
[2020-03-17T14:28:04,793][DEBUG][logstash.runner          ] http.environment: "production"
[2020-03-17T14:28:04,799][DEBUG][logstash.runner          ] queue.type: "memory"
[2020-03-17T14:28:04,802][DEBUG][logstash.runner          ] queue.drain: false
[2020-03-17T14:28:04,803][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2020-03-17T14:28:04,805][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2020-03-17T14:28:04,806][DEBUG][logstash.runner          ] queue.max_events: 0
[2020-03-17T14:28:04,808][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2020-03-17T14:28:04,809][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2020-03-17T14:28:04,810][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2020-03-17T14:28:04,812][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2020-03-17T14:28:04,813][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2020-03-17T14:28:04,814][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2020-03-17T14:28:04,816][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2020-03-17T14:28:04,817][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2020-03-17T14:28:04,819][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2020-03-17T14:28:04,820][DEBUG][logstash.runner          ] slowlog.threshold.trace: -1
[2020-03-17T14:28:04,821][DEBUG][logstash.runner          ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2020-03-17T14:28:04,822][DEBUG][logstash.runner          ] *keystore.file: "/etc/logstash/logstash.keystore" (default: "/usr/share/logstash/config/logstash.keystore")
[2020-03-17T14:28:04,824][DEBUG][logstash.runner          ] *path.queue: "/var/lib/logstash/queue" (default: "/usr/share/logstash/data/queue")
[2020-03-17T14:28:04,825][DEBUG][logstash.runner          ] *path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue" (default: "/usr/share/logstash/data/dead_letter_queue")
[2020-03-17T14:28:04,826][DEBUG][logstash.runner          ] *path.settings: "/etc/logstash" (default: "/usr/share/logstash/config")
[2020-03-17T14:28:04,828][DEBUG][logstash.runner          ] *path.logs: "/var/log/logstash" (default: "/usr/share/logstash/logs")
[2020-03-17T14:28:04,829][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2020-03-17T14:28:04,833][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2020-03-17T14:28:04,837][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2020-03-17T14:28:04,838][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2020-03-17T14:28:04,843][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2020-03-17T14:28:04,845][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2020-03-17T14:28:04,846][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2020-03-17T14:28:04,847][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2020-03-17T14:28:04,849][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2020-03-17T14:28:04,850][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2020-03-17T14:28:04,851][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2020-03-17T14:28:04,853][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2020-03-17T14:28:04,854][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2020-03-17T14:28:04,857][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2020-03-17T14:28:04,858][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2020-03-17T14:28:04,860][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2020-03-17T14:28:04,862][DEBUG][logstash.runner          ] node.uuid: ""
[2020-03-17T14:28:04,863][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2020-03-17T14:28:04,937][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2020-03-17T14:28:04,950][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-03-17T14:28:05,146][DEBUG][logstash.agent           ] Setting up metric collection
[2020-03-17T14:28:05,240][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:28:05,303][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuacctresource] File /sys/fs/cgroup/cpuacct/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpuacct.usage cannot be found, try providing an override 'ls.cgroup.cpuacct.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:05,325][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_period_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:05,331][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_quota_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:05,344][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.stat cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:05,605][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:28:05,781][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-03-17T14:28:05,793][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-03-17T14:28:05,844][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:28:05,859][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2020-03-17T14:28:06,002][DEBUG][logstash.agent           ] Starting agent
[2020-03-17T14:28:06,160][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>[]}
[2020-03-17T14:28:06,182][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/tests/001_input.conf"}
[2020-03-17T14:28:06,215][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/tests/002_output.conf"}
[2020-03-17T14:28:06,284][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2020-03-17T14:28:06,301][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2020-03-17T14:28:07,367][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2020-03-17T14:28:08,015][DEBUG][org.reflections.Reflections] going to scan these urls:
jar:file:/usr/share/logstash/logstash-core/lib/jars/logstash-core.jar!/
[2020-03-17T14:28:08,071][INFO ][org.reflections.Reflections] Reflections took 53 ms to scan 1 urls, producing 20 keys and 40 values
[2020-03-17T14:28:08,083][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Codec
[2020-03-17T14:28:08,084][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Input
[2020-03-17T14:28:08,085][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.RubyBasicObject -> org.jruby.RubyObject
[2020-03-17T14:28:08,086][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Cloneable -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,087][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.IRubyObject -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,088][DEBUG][org.reflections.Reflections] expanded subtype java.io.Serializable -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,089][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Comparable -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,090][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.marshal.CoreObjectType -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,091][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.InstanceVariables -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,092][DEBUG][org.reflections.Reflections] expanded subtype org.jruby.runtime.builtin.InternalVariables -> org.jruby.RubyBasicObject
[2020-03-17T14:28:08,093][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Output
[2020-03-17T14:28:08,095][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Metric -> co.elastic.logstash.api.NamespacedMetric
[2020-03-17T14:28:08,096][DEBUG][org.reflections.Reflections] expanded subtype java.security.SecureClassLoader -> java.net.URLClassLoader
[2020-03-17T14:28:08,097][DEBUG][org.reflections.Reflections] expanded subtype java.lang.ClassLoader -> java.security.SecureClassLoader
[2020-03-17T14:28:08,098][DEBUG][org.reflections.Reflections] expanded subtype java.io.Closeable -> java.net.URLClassLoader
[2020-03-17T14:28:08,099][DEBUG][org.reflections.Reflections] expanded subtype java.lang.AutoCloseable -> java.io.Closeable
[2020-03-17T14:28:08,102][DEBUG][org.reflections.Reflections] expanded subtype java.lang.Comparable -> java.lang.Enum
[2020-03-17T14:28:08,103][DEBUG][org.reflections.Reflections] expanded subtype java.io.Serializable -> java.lang.Enum
[2020-03-17T14:28:08,105][DEBUG][org.reflections.Reflections] expanded subtype co.elastic.logstash.api.Plugin -> co.elastic.logstash.api.Filter
[2020-03-17T14:28:08,230][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"beats", :type=>"input", :class=>LogStash::Inputs::Beats}
[2020-03-17T14:28:08,399][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2020-03-17T14:28:08,424][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_301b6dce-f684-40a4-82df-38e2372f601b"
[2020-03-17T14:28:08,426][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2020-03-17T14:28:08,433][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-03-17T14:28:08,452][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_certificate = "/etc/logstash/conf.d/logstash.crt"
[2020-03-17T14:28:08,454][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@include_codec_tag = false
[2020-03-17T14:28:08,455][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@id = "98d74a215b941489d03f0e1f0c326991b2c101a7875f3d422521399d0a92368f"
[2020-03-17T14:28:08,456][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_key = "/etc/logstash/conf.d/logstash.key"
[2020-03-17T14:28:08,457][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@port = 5440
[2020-03-17T14:28:08,459][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl = true
[2020-03-17T14:28:08,461][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@enable_metric = true
[2020-03-17T14:28:08,477][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_301b6dce-f684-40a4-82df-38e2372f601b", enable_metric=>true, charset=>"UTF-8">
[2020-03-17T14:28:08,479][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@add_field = {}
[2020-03-17T14:28:08,480][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2020-03-17T14:28:08,485][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2020-03-17T14:28:08,488][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@add_hostname = false
[2020-03-17T14:28:08,489][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2020-03-17T14:28:08,491][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_peer_metadata = false
[2020-03-17T14:28:08,492][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2020-03-17T14:28:08,493][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2020-03-17T14:28:08,495][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2020-03-17T14:28:08,497][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2020-03-17T14:28:08,498][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2020-03-17T14:28:08,499][DEBUG][logstash.inputs.beats    ] config LogStash::Inputs::Beats/@executor_threads = 4
[2020-03-17T14:28:08,543][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2020-03-17T14:28:08,588][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_c95b5dc8-83ba-4fdb-80e0-5519dfe36d44"
[2020-03-17T14:28:08,590][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2020-03-17T14:28:08,591][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-03-17T14:28:08,604][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "test"
[2020-03-17T14:28:08,609][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2020-03-17T14:28:08,611][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = true
[2020-03-17T14:28:08,612][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "8c20d45474fc2c66bc2c0bf4c259110d2a21a9b5bcba7fbb9864e82342eddeef"
[2020-03-17T14:28:08,613][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "elastic"
[2020-03-17T14:28:08,630][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [http://elasticsearch:9200]
[2020-03-17T14:28:08,632][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2020-03-17T14:28:08,634][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_c95b5dc8-83ba-4fdb-80e0-5519dfe36d44", enable_metric=>true, charset=>"UTF-8">
[2020-03-17T14:28:08,635][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2020-03-17T14:28:08,636][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2020-03-17T14:28:08,637][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2020-03-17T14:28:08,638][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2020-03-17T14:28:08,640][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2020-03-17T14:28:08,641][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2020-03-17T14:28:08,642][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2020-03-17T14:28:08,643][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2020-03-17T14:28:08,644][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2020-03-17T14:28:08,646][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2020-03-17T14:28:08,647][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2020-03-17T14:28:08,648][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2020-03-17T14:28:08,649][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2020-03-17T14:28:08,650][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2020-03-17T14:28:08,651][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2020-03-17T14:28:08,656][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2020-03-17T14:28:08,657][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2020-03-17T14:28:08,658][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = "auto"
[2020-03-17T14:28:08,659][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_rollover_alias = "logstash"
[2020-03-17T14:28:08,660][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2020-03-17T14:28:08,661][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_policy = "logstash-policy"
[2020-03-17T14:28:08,662][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2020-03-17T14:28:08,663][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2020-03-17T14:28:08,664][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2020-03-17T14:28:08,665][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2020-03-17T14:28:08,665][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2020-03-17T14:28:08,666][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2020-03-17T14:28:08,667][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2020-03-17T14:28:08,668][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2020-03-17T14:28:08,669][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2020-03-17T14:28:08,669][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2020-03-17T14:28:08,670][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2020-03-17T14:28:08,720][DEBUG][logstash.javapipeline    ] Starting pipeline {:pipeline_id=>"main"}
[2020-03-17T14:28:08,791][DEBUG][logstash.outputs.elasticsearch][main] Normalizing http path {:path=>nil, :normalized=>nil}
[2020-03-17T14:28:09,332][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2020-03-17T14:28:09,353][DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@elasticsearch:9200/, :path=>"/"}
[2020-03-17T14:28:09,784][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2020-03-17T14:28:09,857][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-03-17T14:28:09,863][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-03-17T14:28:09,943][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
[2020-03-17T14:28:10,104][INFO ][logstash.outputs.elasticsearch][main] Using default mapping template
[2020-03-17T14:28:10,171][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-03-17T14:28:10,177][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>125, "pipeline.sources"=>["/etc/logstash/conf.d/tests/001_input.conf", "/etc/logstash/conf.d/tests/002_output.conf"], :thread=>"#<Thread:0x7884e6f1 run>"}
[2020-03-17T14:28:10,222][INFO ][logstash.outputs.elasticsearch][main] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2020-03-17T14:28:10,265][DEBUG][logstash.outputs.elasticsearch][main] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2020-03-17T14:28:10,606][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuacctresource] File /sys/fs/cgroup/cpuacct/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpuacct.usage cannot be found, try providing an override 'ls.cgroup.cpuacct.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:10,616][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_period_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:10,623][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_quota_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:10,625][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.stat cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:10,920][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-03-17T14:28:10,924][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-03-17T14:28:11,054][DEBUG][org.logstash.config.ir.CompiledPipeline][main] Compiled output
 P[output-elasticsearch{"hosts"=>["http://elasticsearch:9200"], "user"=>"elastic", "password"=>"elastic", "sniffing"=>"true", "index"=>"test"}|[file]/etc/logstash/conf.d/tests/002_output.conf:2:5:```
elasticsearch {
      hosts => ["http://elasticsearch:9200"]
      user => "elastic"
      password => "elastic"
      sniffing => true
      index => "test"
   }
```]
 into
 org.logstash.config.ir.compiler.ComputeStepSyntaxElement@c99a0cea
[2020-03-17T14:28:11,259][DEBUG][io.netty.util.internal.logging.InternalLoggerFactory][main] Using SLF4J as the default logging framework
[2020-03-17T14:28:11,289][DEBUG][io.netty.util.internal.PlatformDependent0][main] -Dio.netty.noUnsafe: false
[2020-03-17T14:28:11,290][DEBUG][io.netty.util.internal.PlatformDependent0][main] Java version: 8
[2020-03-17T14:28:11,292][DEBUG][io.netty.util.internal.PlatformDependent0][main] sun.misc.Unsafe.theUnsafe: available
[2020-03-17T14:28:11,293][DEBUG][io.netty.util.internal.PlatformDependent0][main] sun.misc.Unsafe.copyMemory: available
[2020-03-17T14:28:11,295][DEBUG][io.netty.util.internal.PlatformDependent0][main] java.nio.Buffer.address: available
[2020-03-17T14:28:11,296][DEBUG][io.netty.util.internal.PlatformDependent0][main] direct buffer constructor: available
[2020-03-17T14:28:11,298][DEBUG][io.netty.util.internal.PlatformDependent0][main] java.nio.Bits.unaligned: available, true
[2020-03-17T14:28:11,298][DEBUG][io.netty.util.internal.PlatformDependent0][main] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
[2020-03-17T14:28:11,299][DEBUG][io.netty.util.internal.PlatformDependent0][main] java.nio.DirectByteBuffer.<init>(long, int): available
[2020-03-17T14:28:11,300][DEBUG][io.netty.util.internal.PlatformDependent][main] sun.misc.Unsafe: available
[2020-03-17T14:28:11,301][DEBUG][io.netty.util.internal.PlatformDependent][main] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
[2020-03-17T14:28:11,302][DEBUG][io.netty.util.internal.PlatformDependent][main] -Dio.netty.bitMode: 64 (sun.arch.data.model)
[2020-03-17T14:28:11,303][DEBUG][io.netty.util.internal.PlatformDependent][main] -Dio.netty.maxDirectMemory: 1038876672 bytes
[2020-03-17T14:28:11,304][DEBUG][io.netty.util.internal.PlatformDependent][main] -Dio.netty.uninitializedArrayAllocationThreshold: -1
[2020-03-17T14:28:11,306][DEBUG][io.netty.util.internal.CleanerJava6][main] java.nio.ByteBuffer.cleaner(): available
[2020-03-17T14:28:11,307][DEBUG][io.netty.util.internal.PlatformDependent][main] -Dio.netty.noPreferDirect: false
[2020-03-17T14:28:11,310][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
[2020-03-17T14:28:11,311][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] -Dio.netty.native.deleteLibAfterLoading: true
[2020-03-17T14:28:11,312][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] -Dio.netty.native.tryPatchShadedId: true
[2020-03-17T14:28:11,316][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] Unable to load the library 'netty_tcnative_linux_x86_64', trying other loading mechanism.
[2020-03-17T14:28:11,317][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] netty_tcnative_linux_x86_64 cannot be loaded from java.libary.path, now trying export to -Dio.netty.native.workdir: /tmp
[2020-03-17T14:28:11,340][DEBUG][io.netty.util.internal.NativeLibraryLoader][main] Successfully loaded the library /tmp/libnetty_tcnative_linux_x86_649039138783622204075.so
[2020-03-17T14:28:11,341][DEBUG][io.netty.handler.ssl.OpenSsl][main] Initialize netty-tcnative using engine: 'default'
[2020-03-17T14:28:11,342][DEBUG][io.netty.handler.ssl.OpenSsl][main] netty-tcnative using native library: BoringSSL
[2020-03-17T14:28:11,455][DEBUG][io.netty.util.ResourceLeakDetector][main] -Dio.netty.leakDetection.level: simple
[2020-03-17T14:28:11,457][DEBUG][io.netty.util.ResourceLeakDetector][main] -Dio.netty.leakDetection.targetRecords: 4
[2020-03-17T14:28:11,469][DEBUG][io.netty.buffer.AbstractByteBuf][main] -Dio.netty.buffer.checkAccessible: true
[2020-03-17T14:28:11,471][DEBUG][io.netty.buffer.AbstractByteBuf][main] -Dio.netty.buffer.checkBounds: true
[2020-03-17T14:28:11,473][DEBUG][io.netty.util.ResourceLeakDetectorFactory][main] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@7f9e106d
[2020-03-17T14:28:11,487][DEBUG][io.netty.util.internal.InternalThreadLocalMap][main] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
[2020-03-17T14:28:11,488][DEBUG][io.netty.util.internal.InternalThreadLocalMap][main] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
[2020-03-17T14:28:11,492][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.numHeapArenas: 8
[2020-03-17T14:28:11,493][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.numDirectArenas: 8
[2020-03-17T14:28:11,493][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.pageSize: 8192
[2020-03-17T14:28:11,494][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.maxOrder: 11
[2020-03-17T14:28:11,495][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.chunkSize: 16777216
[2020-03-17T14:28:11,496][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.tinyCacheSize: 512
[2020-03-17T14:28:11,496][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.smallCacheSize: 256
[2020-03-17T14:28:11,499][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.normalCacheSize: 64
[2020-03-17T14:28:11,499][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
[2020-03-17T14:28:11,500][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.cacheTrimInterval: 8192
[2020-03-17T14:28:11,506][DEBUG][io.netty.buffer.PooledByteBufAllocator][main] -Dio.netty.allocator.useCacheForAllThreads: true
[2020-03-17T14:28:11,526][DEBUG][io.netty.buffer.ByteBufUtil][main] -Dio.netty.allocator.type: pooled
[2020-03-17T14:28:11,528][DEBUG][io.netty.buffer.ByteBufUtil][main] -Dio.netty.threadLocalDirectBufferSize: 0
[2020-03-17T14:28:11,529][DEBUG][io.netty.buffer.ByteBufUtil][main] -Dio.netty.maxThreadLocalCharBufferSize: 16384
[2020-03-17T14:28:11,553][DEBUG][io.netty.util.ResourceLeakDetectorFactory][main] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@5a2752dc
[2020-03-17T14:28:11,555][DEBUG][io.netty.handler.ssl.ReferenceCountedOpenSslContext][main] ReferenceCountedOpenSslContext supports -Djdk.tls.ephemeralDHKeySize={int}, but got: matched
[2020-03-17T14:28:11,566][DEBUG][io.netty.util.Recycler   ][main] -Dio.netty.recycler.maxCapacityPerThread: 4096
[2020-03-17T14:28:11,567][DEBUG][io.netty.util.Recycler   ][main] -Dio.netty.recycler.maxSharedCapacityFactor: 2
[2020-03-17T14:28:11,569][DEBUG][io.netty.util.Recycler   ][main] -Dio.netty.recycler.linkCapacity: 16
[2020-03-17T14:28:11,572][DEBUG][io.netty.util.Recycler   ][main] -Dio.netty.recycler.ratio: 8
[2020-03-17T14:28:11,590][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
[2020-03-17T14:28:11,591][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
[2020-03-17T14:28:11,592][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 => ECDHE-RSA-AES128-GCM-SHA256
[2020-03-17T14:28:11,592][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_GCM_SHA256 => ECDHE-RSA-AES128-GCM-SHA256
[2020-03-17T14:28:11,593][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 => ECDHE-ECDSA-AES256-GCM-SHA384
[2020-03-17T14:28:11,595][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 => ECDHE-ECDSA-AES256-GCM-SHA384
[2020-03-17T14:28:11,596][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 => ECDHE-RSA-AES256-GCM-SHA384
[2020-03-17T14:28:11,597][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_GCM_SHA384 => ECDHE-RSA-AES256-GCM-SHA384
[2020-03-17T14:28:11,598][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-ECDSA-CHACHA20-POLY1305
[2020-03-17T14:28:11,599][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-ECDSA-CHACHA20-POLY1305
[2020-03-17T14:28:11,600][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-RSA-CHACHA20-POLY1305
[2020-03-17T14:28:11,601][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-RSA-CHACHA20-POLY1305
[2020-03-17T14:28:11,602][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-PSK-CHACHA20-POLY1305
[2020-03-17T14:28:11,603][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-PSK-CHACHA20-POLY1305
[2020-03-17T14:28:11,604][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA => ECDHE-ECDSA-AES128-SHA
[2020-03-17T14:28:11,605][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_CBC_SHA => ECDHE-ECDSA-AES128-SHA
[2020-03-17T14:28:11,605][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 => ECDHE-ECDSA-AES128-SHA256
[2020-03-17T14:28:11,606][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 => ECDHE-ECDSA-AES128-SHA256
[2020-03-17T14:28:11,607][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA => ECDHE-RSA-AES128-SHA
[2020-03-17T14:28:11,608][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_CBC_SHA => ECDHE-RSA-AES128-SHA
[2020-03-17T14:28:11,609][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 => ECDHE-RSA-AES128-SHA256
[2020-03-17T14:28:11,610][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_CBC_SHA256 => ECDHE-RSA-AES128-SHA256
[2020-03-17T14:28:11,611][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA => ECDHE-PSK-AES128-CBC-SHA
[2020-03-17T14:28:11,613][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_PSK_WITH_AES_128_CBC_SHA => ECDHE-PSK-AES128-CBC-SHA
[2020-03-17T14:28:11,614][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA => ECDHE-ECDSA-AES256-SHA
[2020-03-17T14:28:11,615][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_CBC_SHA => ECDHE-ECDSA-AES256-SHA
[2020-03-17T14:28:11,616][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 => ECDHE-ECDSA-AES256-SHA384
[2020-03-17T14:28:11,617][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 => ECDHE-ECDSA-AES256-SHA384
[2020-03-17T14:28:11,618][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA => ECDHE-RSA-AES256-SHA
[2020-03-17T14:28:11,618][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_CBC_SHA => ECDHE-RSA-AES256-SHA
[2020-03-17T14:28:11,620][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 => ECDHE-RSA-AES256-SHA384
[2020-03-17T14:28:11,621][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_CBC_SHA384 => ECDHE-RSA-AES256-SHA384
[2020-03-17T14:28:11,622][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA => ECDHE-PSK-AES256-CBC-SHA
[2020-03-17T14:28:11,623][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_ECDHE_PSK_WITH_AES_256_CBC_SHA => ECDHE-PSK-AES256-CBC-SHA
[2020-03-17T14:28:11,623][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_128_GCM_SHA256 => AES128-GCM-SHA256
[2020-03-17T14:28:11,624][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_128_GCM_SHA256 => AES128-GCM-SHA256
[2020-03-17T14:28:11,625][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_256_GCM_SHA384 => AES256-GCM-SHA384
[2020-03-17T14:28:11,626][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_256_GCM_SHA384 => AES256-GCM-SHA384
[2020-03-17T14:28:11,627][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_128_CBC_SHA => AES128-SHA
[2020-03-17T14:28:11,627][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_128_CBC_SHA => AES128-SHA
[2020-03-17T14:28:11,628][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_128_CBC_SHA256 => AES128-SHA256
[2020-03-17T14:28:11,629][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_128_CBC_SHA256 => AES128-SHA256
[2020-03-17T14:28:11,630][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_PSK_WITH_AES_128_CBC_SHA => PSK-AES128-CBC-SHA
[2020-03-17T14:28:11,631][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_PSK_WITH_AES_128_CBC_SHA => PSK-AES128-CBC-SHA
[2020-03-17T14:28:11,632][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_256_CBC_SHA => AES256-SHA
[2020-03-17T14:28:11,633][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_256_CBC_SHA => AES256-SHA
[2020-03-17T14:28:11,633][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_AES_256_CBC_SHA256 => AES256-SHA256
[2020-03-17T14:28:11,634][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_AES_256_CBC_SHA256 => AES256-SHA256
[2020-03-17T14:28:11,635][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2020-03-17T14:28:11,636][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2020-03-17T14:28:11,637][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: TLS_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2020-03-17T14:28:11,638][DEBUG][io.netty.handler.ssl.CipherSuiteConverter][main] Cipher suite mapping: SSL_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2020-03-17T14:28:11,639][DEBUG][io.netty.handler.ssl.OpenSsl][main] Supported protocols (OpenSSL): [SSLv2Hello, SSLv3, TLSv1, TLSv1.1, TLSv1.2]
[2020-03-17T14:28:11,640][DEBUG][io.netty.handler.ssl.OpenSsl][main] Default cipher suites (OpenSSL): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA]
[2020-03-17T14:28:11,656][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5440"}
[2020-03-17T14:28:11,672][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384
[2020-03-17T14:28:11,674][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
[2020-03-17T14:28:11,675][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
[2020-03-17T14:28:11,676][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256
[2020-03-17T14:28:11,677][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384
[2020-03-17T14:28:11,678][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384
[2020-03-17T14:28:11,681][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256
[2020-03-17T14:28:11,681][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Cipher is supported: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256
[2020-03-17T14:28:11,693][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-03-17T14:28:11,706][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-03-17T14:28:11,707][DEBUG][logstash.javapipeline    ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7884e6f1 run>"}
[2020-03-17T14:28:11,745][DEBUG][io.netty.channel.MultithreadEventLoopGroup][main] -Dio.netty.eventLoopThreads: 8
[2020-03-17T14:28:11,788][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-03-17T14:28:11,800][DEBUG][io.netty.channel.nio.NioEventLoop][main] -Dio.netty.noKeySetOptimization: false
[2020-03-17T14:28:11,801][DEBUG][io.netty.channel.nio.NioEventLoop][main] -Dio.netty.selectorAutoRebuildThreshold: 512
[2020-03-17T14:28:11,810][DEBUG][io.netty.util.internal.PlatformDependent][main] org.jctools-core.MpscChunkedArrayQueue: available
[2020-03-17T14:28:11,819][INFO ][org.logstash.beats.Server][main] Starting server on port: 5440
[2020-03-17T14:28:11,851][DEBUG][io.netty.channel.DefaultChannelId][main] -Dio.netty.processId: 203 (auto-detected)
[2020-03-17T14:28:11,857][DEBUG][io.netty.util.NetUtil    ][main] -Djava.net.preferIPv4Stack: false
[2020-03-17T14:28:11,858][DEBUG][io.netty.util.NetUtil    ][main] -Djava.net.preferIPv6Addresses: false
[2020-03-17T14:28:11,862][DEBUG][io.netty.util.NetUtil    ][main] Loopback interface: lo (lo, 127.0.0.1)
[2020-03-17T14:28:11,864][DEBUG][io.netty.util.NetUtil    ][main] /proc/sys/net/core/somaxconn: 128
[2020-03-17T14:28:11,866][DEBUG][logstash.agent           ] Starting puma
[2020-03-17T14:28:11,880][DEBUG][io.netty.channel.DefaultChannelId][main] -Dio.netty.machineId: 02:42:ac:ff:fe:15:00:0e (auto-detected)
[2020-03-17T14:28:11,917][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2020-03-17T14:28:11,966][DEBUG][logstash.api.service     ] [api-service] start
[2020-03-17T14:28:12,159][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2020-03-17T14:28:14,958][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[http://elastic:xxxxxx@elasticsearch:9200/], :added=>[http://elastic:xxxxxx@10.0.0.95:9200/]}}
[2020-03-17T14:28:14,969][DEBUG][logstash.outputs.elasticsearch][main] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elastic:xxxxxx@10.0.0.95:9200/, :path=>"/"}
[2020-03-17T14:28:14,993][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@10.0.0.95:9200/"}
[2020-03-17T14:28:15,675][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuacctresource] File /sys/fs/cgroup/cpuacct/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpuacct.usage cannot be found, try providing an override 'ls.cgroup.cpuacct.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:15,677][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_period_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:15,678][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_quota_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:15,679][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.stat cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:15,941][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-03-17T14:28:15,948][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-03-17T14:28:16,706][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-03-17T14:28:20,686][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuacctresource] File /sys/fs/cgroup/cpuacct/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpuacct.usage cannot be found, try providing an override 'ls.cgroup.cpuacct.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:20,688][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_period_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:20,690][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.cfs_quota_us cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:20,691][DEBUG][logstash.instrument.periodicpoller.cgroup.cpuresource] File /sys/fs/cgroup/cpu/docker/fd5c71bfc769b2373b90ac586d7e11ce0437dd2a29348c121633f4291d35b2ed/cpu.stat cannot be found, try providing an override 'ls.cgroup.cpu.path.override' in the Logstash JAVA_OPTS environment variable
[2020-03-17T14:28:20,956][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2020-03-17T14:28:20,957][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2020-03-17T14:28:21,679][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Available ciphers:[ECDHE-ECDSA-AES128-GCM-SHA256, ECDHE-RSA-AES128-GCM-SHA256, ECDHE-ECDSA-AES256-GCM-SHA384, ECDHE-RSA-AES256-GCM-SHA384, ECDHE-ECDSA-CHACHA20-POLY1305, ECDHE-RSA-CHACHA20-POLY1305, ECDHE-PSK-CHACHA20-POLY1305, ECDHE-ECDSA-AES128-SHA, ECDHE-ECDSA-AES128-SHA256, ECDHE-RSA-AES128-SHA, ECDHE-RSA-AES128-SHA256, ECDHE-PSK-AES128-CBC-SHA, ECDHE-ECDSA-AES256-SHA, ECDHE-ECDSA-AES256-SHA384, ECDHE-RSA-AES256-SHA, ECDHE-RSA-AES256-SHA384, ECDHE-PSK-AES256-CBC-SHA, AES128-GCM-SHA256, AES256-GCM-SHA384, AES128-SHA, AES128-SHA256, PSK-AES128-CBC-SHA, AES256-SHA, AES256-SHA256, PSK-AES256-CBC-SHA, DES-CBC3-SHA]
[2020-03-17T14:28:21,686][DEBUG][org.logstash.netty.SslSimpleBuilder][main] Ciphers:  [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256, TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256]
[2020-03-17T14:28:21,706][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2020-03-17T14:28:21,718][DEBUG][io.netty.util.ResourceLeakDetectorFactory][main] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@68fd6630
[2020-03-17T14:28:21,773][DEBUG][io.netty.util.ResourceLeakDetectorFactory][main] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@20e891ac
[2020-03-17T14:28:21,779][DEBUG][org.logstash.netty.SslSimpleBuilder][main] TLS: [TLSv1, TLSv1.1, TLSv1.2]
[2020-03-17T14:28:21,818][DEBUG][logstash.codecs.plain    ][main] config LogStash::Codecs::Plain/@id = "plain_301b6dce-f684-40a4-82df-38e2372f601b"
[2020-03-17T14:28:21,825][DEBUG][logstash.codecs.plain    ][main] config LogStash::Codecs::Plain/@enable_metric = true
[2020-03-17T14:28:21,826][DEBUG][logstash.codecs.plain    ][main] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2020-03-17T14:28:21,914][DEBUG][io.netty.handler.ssl.SslHandler][main] [id: 0xf3038847, L:/10.255.1.136:5440 - R:/10.255.0.2:58378] HANDSHAKEN: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384
[2020-03-17T14:28:21,938][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,940][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,942][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,944][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,948][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,952][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,953][DEBUG][org.logstash.beats.ConnectionHandler][main] f3038847: batches pending: true
[2020-03-17T14:28:21,965][DEBUG][org.logstash.beats.BeatsHandler][main] [local: 10.255.1.136:5440, remote: 10.255.0.2:58378] Received a new payload
[2020-03-17T14:28:21,967][DEBUG][org.logstash.beats.BeatsHandler][main] [local: 10.255.1.136:5440, remote: 10.255.0.2:58378] Sending a new message for the listener, sequence: 1
...
duylong commented 4 years ago

Same issue since upgrading to 7.6.2. No problem with 7.5.2...

The bug is not related to Elasticsearch: I also get the error with a filter/mutate block, for example.
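
For illustration only (the file name and filter content below are hypothetical, not taken from duylong's configuration), the failure looks the same whenever the second file of a split configuration declares any plugin; with the Ruby execution the error simply names that plugin instead, e.g. "Could not determine ID for filter/mutate":

```
# 002_filter.conf -- hypothetical second file in a split configuration.
# With java_execution disabled, loading fails on this file with
# "Could not determine ID for filter/mutate".
filter {
  mutate {
    add_field => { "environment" => "test" }
  }
}
```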

mvenukadasula commented 4 years ago

Any update on this ticket? Is it resolved or still happening? I have the same problem.

samary commented 4 years ago

I don't have a proper solution. I worked around the issue by developing in multiple files and combining them into a single file on deployment. I will test version 7.7 with the pipeline ordering mechanism and hope it will solve my issue. A sketch of the deployment step is shown below.
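
The thread does not show the deployment step itself; a minimal sketch of that kind of workaround, assuming the file layout from the original report, is a simple concatenation so Logstash only ever loads a single file:

```
# Hypothetical deployment step: merge the split files into one,
# then point path.config at the combined file instead of the glob.
cat /etc/logstash/conf.d/tests/001_input.conf \
    /etc/logstash/conf.d/tests/002_output.conf \
    > /etc/logstash/conf.d/combined.conf
```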

mvenukadasula commented 4 years ago

Looks like the issue is resolved with "java_execution" in 7.7.0. There is no need to change the flag in this version, as it is expected to work with the Java execution. I will post my results soon.

duylong commented 4 years ago

Same problem with 7.7.1 when I set pipeline.java_execution to false. (See the sketch below for the setting in question.)
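
For reference, pipeline.java_execution can also be set in logstash.yml rather than on the command line; a hypothetical excerpt of the kind of setup being described (paths and values assumed, not copied from duylong's environment):

```
# logstash.yml -- hypothetical excerpt. Disabling the Java execution
# engine falls back to the legacy Ruby execution, which is where the
# "Could not determine ID for ..." error shows up with split config files.
pipeline.java_execution: false
path.config: "/etc/logstash/conf.d/*.conf"
```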

samary commented 4 years ago

I tested 7.7.0 with --pipeline.workers 1 --pipeline.ordered true while keeping java_execution enabled, and it now works without any error.
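
A sketch of that invocation, reusing the paths from the original report and assuming the other flags were left unchanged:

```
/usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash \
  --path.config "/etc/logstash/conf.d/tests/*.conf" \
  --pipeline.workers 1 \
  --pipeline.ordered true
```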

I'm closing this issue.

duylong commented 4 years ago

The initial problem is "disable java_execution and split the pipeline configuration into multiple files", so I don't understand why the ticket was closed with a workaround that keeps java_execution enabled.

mvenukadasula commented 4 years ago

I agree with @duylong: if Logstash does not start with "java_execution: false", the ticket should still be open.

@duylong, what are you trying to achieve by disabling the Java execution? For me, event ordering is the problem, and I'm hoping that is resolved in 7.7.0 with the Java execution enabled (the default setting). I'm testing that fix right now.

duylong commented 4 years ago

I have a complex configuration split across several files, and the Java execution does not work for me when enabled: loading is long and never finishes. Everything worked fine up to version 7.5.2; now I get this error and can no longer upgrade.

samary commented 4 years ago

@duylong: Indeed, the root cause is not solved. For my use case, the only requirement was to guarantee event ordering (which I had worked around by disabling java_execution), and since ordering now works with it enabled, this is fine by me.

I don't really understand why you need to disable java_execution, and I think you will need to find a way to get your configuration working with it, since the Ruby execution engine will be decommissioned in 8.0 if I remember correctly.

I'm reopening it then, since this is not really solved for your use case.