logstash-plugins / logstash-codec-nmap

NMap XML decoding for logstash
Apache License 2.0

NMAP codec - HTTP gives a 500 status when using the codec on an http input but 200 status without the codec #13

Closed landonstewart closed 7 years ago

landonstewart commented 7 years ago

I've been following this tutorial:

Does anyone know why I might be seeing the line below in my logs, and why the codec isn't working? I get a 500 error on the HTTP connection with the codec, but a 200 without it. I believe the following line is the issue, but I don't know how to confirm that I'm using the latest codec.

[2017-03-24T16:20:17,614][INFO ][logstash.codecs.nmap     ] Using version 0.1.x codec plugin 'nmap'. This plugin isn't well supported by the community and likely has no maintainer.

Logstash starts up without issue and is ready for input on the configured port (8000) using the nmap codec. There is a warning in the debug output, though: "INFO logstash.codecs.nmap - Using version 0.1.x codec plugin 'nmap'. This plugin isn't well supported by the community and likely has no maintainer." I don't know whether that is applicable here.
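Since the warning complains about a 0.1.x codec, one thing worth checking is which plugin version is actually installed, and updating it if it's stale. A minimal sketch, assuming the standard .deb install path (`/usr/share/logstash`):

```shell
# Show the installed version of the nmap codec, then update it if possible.
# The path below is an assumption based on a .deb package install.
LS_BIN=/usr/share/logstash/bin/logstash-plugin
if [ -x "$LS_BIN" ]; then
  "$LS_BIN" list --verbose logstash-codec-nmap
  "$LS_BIN" update logstash-codec-nmap
else
  echo "logstash-plugin not found at $LS_BIN (commands shown for reference)"
fi
```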

The error I see in the debug output is this (I also saw the same error in the logs before enabling debug logging):

17:52:14.411 [Ruby-0-Thread-20: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/puma-2.16.0-java/lib/puma/thread_pool.rb:61] ERROR logstash.inputs.http - unable to process event {"request_method"=>"POST", "request_path"=>"/", "request_uri"=>"/", "http_version"=>"HTTP/1.1", "http_host"=>"localhost:8000", "http_user_agent"=>"curl/7.47.0", "http_accept"=>"*/*", "http_x_nmap_target"=>"example.net", "content_length"=>"19525", "content_type"=>"application/x-www-form-urlencoded", "http_expect"=>"100-continue"}. exception => java.lang.ArrayIndexOutOfBoundsException: -1

As a result of the error above, I've stripped out most of the filter configuration and the elasticsearch output for testing. Removing those parts makes no difference, so I'm 99% sure the issue is related directly to 'codec => nmap': if I use something like 'codec => plain' I get no error, and instead I get the expected (although unusable) output.

What I have configured is in /etc/logstash/conf.d/12-input-nmap.conf:

input {
  http {
    port => 8000
    codec => nmap
    tags => [nmap]
  }
}

output {
  stdout {
    codec => rubydebug
  }
}
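For what it's worth, the pipeline config can be syntax-checked before starting it (the `config.test_and_exit` setting appears in the debug output below); a sketch assuming the standard .deb install path:

```shell
# Parse the config and exit without starting the pipeline.
# The binary path is an assumption based on a .deb package install.
LS=/usr/share/logstash/bin/logstash
if [ -x "$LS" ]; then
  "$LS" --config.test_and_exit -f /etc/logstash/conf.d/12-input-nmap.conf
else
  echo "logstash not found at $LS (command shown for reference)"
fi
```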

I run the configuration and send the nmap XML output like this:

# nmap -A 192.168.1.0/24 -oX - | curl -v -H "x-nmap-target: local-subnet" http://192.168.10.20:8000 -d @-
* Rebuilt URL to: http://192.168.10.20:8000
* Hostname was NOT found in DNS cache
*   Trying 192.168.10.20...
* Connected to 192.168.10.20 (192.168.10.20) port 8000 (#0)
> POST / HTTP/1.1
> User-Agent: curl/7.35.0
> Host: 192.168.10.20:8000
> Accept: */*
> x-nmap-target: local-subnet
> Content-Length: 19525
> Content-Type: application/x-www-form-urlencoded
> Expect: 100-continue
> 
* Done waiting for 100-continue
< HTTP/1.1 500 Internal Server Error
< Content-Type: text/plain
< Content-Length: 14
* HTTP error before end of send, stop sending
< 
* Closing connection 0

It seems the codec just doesn't want to hear what curl has to say.
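One variable that may be worth ruling out is curl itself: with `-d @-`, curl strips carriage returns and newlines from the data it reads, so the XML body arrives as a single long line. `--data-binary @-` posts the stream verbatim. A variant of the same command worth trying (same listener as above; guarded so it only runs where nmap exists):

```shell
# Same POST, but with --data-binary so the XML newlines are preserved.
# Whether this changes the codec's behavior is untested here.
POST_URL="http://192.168.10.20:8000"
if command -v nmap >/dev/null 2>&1; then
  nmap -A 192.168.1.0/24 -oX - \
    | curl -v -H "x-nmap-target: local-subnet" --data-binary @- "$POST_URL"
else
  echo "nmap not installed here; command shown for reference"
fi
```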

Here is the debug output in its entirety, in case there is any applicable/interesting information:

http://pastebin.com/5RZqLxdT

OTHER INFO:

# dpkg -l | grep logstash | awk '{ print $2"\t"$3 }'
logstash    1:5.2.2-1

# java -version
java version "1.8.0_121"
Java(TM) SE Runtime Environment (build 1.8.0_121-b13)
Java HotSpot(TM) 64-Bit Server VM (build 25.121-b13, mixed mode)

# ruby -v
ruby 2.4.0p0 (2016-12-24 revision 57164) [x86_64-linux]

PS. I posted this to https://discuss.elastic.co/t/nmap-codec-http-gives-a-500-status-when-using-the-codec-on-an-http-input-but-200-status-without-the-codec/79818 as well as tried #logstash on FreeNode to no avail (so far).

andrewvc commented 7 years ago

@TheZeke this is almost certainly a bug, thanks for reporting the issue.

To debug this we'll need to get a full stack trace. The HTTP output should really log full stack traces on fatal errors. Can you try running logstash with --log.level=debug and see if that gets more info?

I'll look into this next week for sure :)

landonstewart commented 7 years ago

Hello and thanks for the reply @andrewvc. Here is the debug output of a test this morning.

The error that appears in the debug output at the exact time I send the nmap XML data looks like this:

12:50:45.019 [Ruby-0-Thread-20: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/puma-2.16.0-java/lib/puma/thread_pool.rb:61] ERROR logstash.inputs.http - unable to process event {"request_method"=>"POST", "request_path"=>"/", "request_uri"=>"/", "http_version"=>"HTTP/1.1", "http_user_agent"=>"curl/7.35.0", "http_host"=>"192.168.10.20:8000", "http_accept"=>"*/*", "http_x_nmap_target"=>"local-subnet", "content_length"=>"19525", "content_type"=>"application/x-www-form-urlencoded", "http_expect"=>"100-continue"}. exception => java.lang.ArrayIndexOutOfBoundsException: -1

If I remove everything except the input {} block using the nmap codec, it does the same thing. If I remove 'codec => nmap' from the input {} block, there is no error, but then the data is obviously not usable by Logstash.

Config used (/etc/logstash/conf.d/12-input-nmap.conf):

input {
  http {
    port => 8000
    codec => nmap
    tags => [nmap]
  }
}

filter {
  if "nmap" in [tags] {
    # Don't emit documents for 'down' hosts
    if [status][state] == "down" {
      drop {}
    }

    mutate {
      # Drop HTTP headers and logstash server hostname
      remove_field => ["headers", "hostname"]
    }

    if "nmap_traceroute_link" == [type] {
      geoip {
        source => "[to][address]"
        target => "[to][geoip]"
      }

      geoip {
        source => "[from][address]"
        target => "[from][geoip]"
      }
    }

    if [ipv4] {
      geoip {
        source => ipv4
        target => geoip
      }
    }

  }
}

output {
  if "nmap" in [tags] {
    elasticsearch {
      document_type => "%{[type]}"
      document_id => "%{[id]}"
      # Nmap data usually isn't too bad, so monthly rotation should be fine
      index => "nmap-logstash-%{+YYYY.MM}"
      template => "/etc/logstash/elasticsearch_nmap_template.json"
      template_name => "logstash_nmap"
      hosts => [ "esnode1-dev:9200","esnode2-dev:9200","esnode3-dev:9200" ]
    }
  }
}

Logstash debugging output:

# bin/logstash --log.level=debug -f /etc/logstash/conf.d/12-input-nmap.conf 
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs to console
12:50:10.353 [LogStash::Runner] DEBUG logstash.runner - -------- Logstash Settings (* means modified) ---------
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - node.name: "logstash1-dev.abuse-elk.corp.iweb.com"
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - *path.config: "/etc/logstash/conf.d/12-input-nmap.conf"
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - path.data: "/usr/share/logstash/data"
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - config.test_and_exit: false
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - config.reload.automatic: false
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - config.reload.interval: 3
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - metric.collect: true
12:50:10.356 [LogStash::Runner] DEBUG logstash.runner - pipeline.id: "main"
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - pipeline.workers: 2
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - pipeline.output.workers: 1
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.size: 125
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - pipeline.batch.delay: 5
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - pipeline.unsafe_shutdown: false
12:50:10.357 [LogStash::Runner] DEBUG logstash.runner - path.plugins: []
12:50:10.358 [LogStash::Runner] DEBUG logstash.runner - config.debug: false
12:50:10.358 [LogStash::Runner] DEBUG logstash.runner - *log.level: "debug" (default: "info")
12:50:10.358 [LogStash::Runner] DEBUG logstash.runner - version: false
12:50:10.358 [LogStash::Runner] DEBUG logstash.runner - help: false
12:50:10.358 [LogStash::Runner] DEBUG logstash.runner - log.format: "plain"
12:50:10.359 [LogStash::Runner] DEBUG logstash.runner - http.host: "127.0.0.1"
12:50:10.359 [LogStash::Runner] DEBUG logstash.runner - http.port: 9600..9700
12:50:10.359 [LogStash::Runner] DEBUG logstash.runner - http.environment: "production"
12:50:10.359 [LogStash::Runner] DEBUG logstash.runner - queue.type: "memory"
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.page_capacity: 262144000
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.max_bytes: 1073741824
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.max_events: 0
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.acks: 1024
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.writes: 1024
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - queue.checkpoint.interval: 1000
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.warn: -1
12:50:10.360 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.info: -1
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.debug: -1
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - slowlog.threshold.trace: -1
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - path.queue: "/usr/share/logstash/data/queue"
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - path.settings: "/usr/share/logstash/config"
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - path.logs: "/usr/share/logstash/logs"
12:50:10.361 [LogStash::Runner] DEBUG logstash.runner - --------------- Logstash Settings -------------------
12:50:10.374 [LogStash::Runner] DEBUG logstash.agent - Agent: Configuring metric collection
12:50:10.376 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.os - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
12:50:10.413 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.jvm - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
12:50:10.450 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.persistentqueue - PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
12:50:10.456 [LogStash::Runner] DEBUG logstash.agent - Reading config file {:config_file=>"/etc/logstash/conf.d/12-input-nmap.conf"}
12:50:10.624 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"http", :type=>"input", :class=>LogStash::Inputs::Http}
12:50:11.219 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"nmap", :type=>"codec", :class=>LogStash::Codecs::Nmap}
12:50:11.222 [LogStash::Runner] INFO  logstash.codecs.nmap - Using version 0.1.x codec plugin 'nmap'. This plugin isn't well supported by the community and likely has no maintainer.
12:50:11.223 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@id = "nmap_9b48c7d3-dca8-4d5a-b6e9-e362e11d9821"
12:50:11.223 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@enable_metric = true
12:50:11.223 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@emit_scan_metadata = true
12:50:11.224 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@emit_hosts = true
12:50:11.225 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@emit_ports = true
12:50:11.226 [LogStash::Runner] DEBUG logstash.codecs.nmap - config LogStash::Codecs::Nmap/@emit_traceroute_links = true
12:50:11.227 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@port = 8000
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@codec = <LogStash::Codecs::Nmap id=>"nmap_9b48c7d3-dca8-4d5a-b6e9-e362e11d9821", enable_metric=>true, emit_scan_metadata=>true, emit_hosts=>true, emit_ports=>true, emit_traceroute_links=>true>
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@tags = ["nmap"]
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-1"
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@enable_metric = true
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@add_field = {}
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@host = "0.0.0.0"
12:50:11.228 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@threads = 4
12:50:11.229 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@ssl = false
12:50:11.229 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@verify_mode = "none"
12:50:11.229 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@additional_codecs = {"application/json"=>"json"}
12:50:11.229 [LogStash::Runner] DEBUG logstash.inputs.http - config LogStash::Inputs::Http/@response_headers = {"Content-Type"=>"text/plain"}
12:50:11.231 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"drop", :type=>"filter", :class=>LogStash::Filters::Drop}
12:50:11.234 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-2"
12:50:11.234 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@enable_metric = true
12:50:11.234 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@add_tag = []
12:50:11.234 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@remove_tag = []
12:50:11.235 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@add_field = {}
12:50:11.235 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@remove_field = []
12:50:11.235 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@periodic_flush = false
12:50:11.235 [LogStash::Runner] DEBUG logstash.filters.drop - config LogStash::Filters::Drop/@percentage = 100
12:50:11.243 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"mutate", :type=>"filter", :class=>LogStash::Filters::Mutate}
12:50:11.246 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@remove_field = ["headers", "hostname"]
12:50:11.246 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-3"
12:50:11.246 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@enable_metric = true
12:50:11.247 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@add_tag = []
12:50:11.247 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@remove_tag = []
12:50:11.247 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@add_field = {}
12:50:11.247 [LogStash::Runner] DEBUG logstash.filters.mutate - config LogStash::Filters::Mutate/@periodic_flush = false
12:50:11.301 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"geoip", :type=>"filter", :class=>LogStash::Filters::GeoIP}
12:50:11.316 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@source = "[to][address]"
12:50:11.316 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@target = "[to][geoip]"
12:50:11.316 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-4"
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@enable_metric = true
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_tag = []
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_tag = []
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_field = {}
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_field = []
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@periodic_flush = false
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@fields = ["city_name", "continent_code", "country_code2", "country_code3", "country_name", "dma_code", "ip", "latitude", "longitude", "postal_code", "region_name", "region_code", "timezone", "location"]
12:50:11.317 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@cache_size = 1000
12:50:11.318 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@lru_cache_size = 1000
12:50:11.318 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@tag_on_failure = ["_geoip_lookup_failure"]
12:50:11.334 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@source = "[from][address]"
12:50:11.334 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@target = "[from][geoip]"
12:50:11.334 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-5"
12:50:11.334 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@enable_metric = true
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_tag = []
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_tag = []
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_field = {}
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_field = []
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@periodic_flush = false
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@fields = ["city_name", "continent_code", "country_code2", "country_code3", "country_name", "dma_code", "ip", "latitude", "longitude", "postal_code", "region_name", "region_code", "timezone", "location"]
12:50:11.335 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@cache_size = 1000
12:50:11.336 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@lru_cache_size = 1000
12:50:11.336 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@tag_on_failure = ["_geoip_lookup_failure"]
12:50:11.337 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@source = "ipv4"
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@target = "geoip"
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-6"
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@enable_metric = true
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_tag = []
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_tag = []
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@add_field = {}
12:50:11.338 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@remove_field = []
12:50:11.341 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@periodic_flush = false
12:50:11.341 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@fields = ["city_name", "continent_code", "country_code2", "country_code3", "country_name", "dma_code", "ip", "latitude", "longitude", "postal_code", "region_name", "region_code", "timezone", "location"]
12:50:11.341 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@cache_size = 1000
12:50:11.341 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@lru_cache_size = 1000
12:50:11.341 [LogStash::Runner] DEBUG logstash.filters.geoip - config LogStash::Filters::GeoIP/@tag_on_failure = ["_geoip_lookup_failure"]
12:50:11.485 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
12:50:11.505 [LogStash::Runner] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
12:50:11.507 [LogStash::Runner] DEBUG logstash.codecs.plain - config LogStash::Codecs::Plain/@id = "plain_c2e3214d-965f-498f-b1dc-9c68dc09d932"
12:50:11.508 [LogStash::Runner] DEBUG logstash.codecs.plain - config LogStash::Codecs::Plain/@enable_metric = true
12:50:11.509 [LogStash::Runner] DEBUG logstash.codecs.plain - config LogStash::Codecs::Plain/@charset = "UTF-8"
12:50:11.512 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@document_type = "%{[type]}"
12:50:11.512 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@document_id = "%{[id]}"
12:50:11.515 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@index = "nmap-logstash-%{+YYYY.MM}"
12:50:11.515 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@template = "/etc/logstash/elasticsearch_nmap_template.json"
12:50:11.515 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@template_name = "logstash_nmap"
12:50:11.516 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@hosts = [//esnode1-dev:9200, //esnode2-dev:9200, //esnode3-dev:9200]
12:50:11.516 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@id = "ac781c0c3a7476c1c4c0e49a2221cf6b6744f300-7"
12:50:11.516 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@enable_metric = true
12:50:11.517 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_c2e3214d-965f-498f-b1dc-9c68dc09d932", enable_metric=>true, charset=>"UTF-8">
12:50:11.517 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@workers = 1
12:50:11.517 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@manage_template = true
12:50:11.519 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@template_overwrite = false
12:50:11.519 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@parent = nil
12:50:11.520 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
12:50:11.520 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@upsert = ""
12:50:11.520 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
12:50:11.521 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@script = ""
12:50:11.521 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@script_type = "inline"
12:50:11.521 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
12:50:11.521 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
12:50:11.521 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
12:50:11.522 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
12:50:11.522 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
12:50:11.522 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
12:50:11.522 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@pipeline = nil
12:50:11.523 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@action = "index"
12:50:11.523 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
12:50:11.525 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@sniffing = false
12:50:11.526 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
12:50:11.526 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@timeout = 60
12:50:11.526 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
12:50:11.526 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@pool_max = 1000
12:50:11.527 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
12:50:11.527 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@healthcheck_path = "/"
12:50:11.527 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@absolute_healthcheck_path = false
12:50:11.527 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
12:50:11.528 [LogStash::Runner] DEBUG logstash.outputs.elasticsearch - config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
12:50:11.534 [Ruby-0-Thread-3: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/task.rb:22] DEBUG logstash.agent - starting agent
12:50:11.538 [Ruby-0-Thread-3: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/task.rb:22] DEBUG logstash.agent - starting pipeline {:id=>"main"}
12:50:11.544 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Normalizing http path {:path=>nil, :normalized=>nil}
12:50:11.858 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://esnode1-dev:9200/, http://esnode2-dev:9200/, http://esnode3-dev:9200/]}}
12:50:11.859 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://esnode1-dev:9200/, :path=>"/"}
12:50:11.963 [[main]-pipeline-manager] WARN  logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x190c9979 URL:http://esnode1-dev:9200/>}
12:50:11.965 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://esnode2-dev:9200/, :path=>"/"}
12:50:11.971 [[main]-pipeline-manager] WARN  logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x4e87035b URL:http://esnode2-dev:9200/>}
12:50:11.972 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://esnode3-dev:9200/, :path=>"/"}
12:50:11.975 [[main]-pipeline-manager] WARN  logstash.outputs.elasticsearch - Restored connection to ES instance {:url=>#<URI::HTTP:0x3976b62f URL:http://esnode3-dev:9200/>}
12:50:11.975 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Using mapping template from {:path=>"/etc/logstash/elasticsearch_nmap_template.json"}
12:50:12.232 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - Attempting to install template {:manage_template=>{"template"=>"nmap-logstash-*", "mappings"=>{"nmap_scan_metadata"=>{"properties"=>{"@timestamp"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "arguments"=>{"type"=>"string", "analyzer"=>"whitespace"}, "host"=>{"type"=>"string", "index"=>"not_analyzed"}, "host_stats"=>{"properties"=>{"down"=>{"type"=>"long"}, "total"=>{"type"=>"long"}, "up"=>{"type"=>"long"}}}, "id"=>{"type"=>"string", "index"=>"not_analyzed"}, "run_stats"=>{"properties"=>{"elapsed"=>{"type"=>"double"}, "end_time"=>{"type"=>"string", "index"=>"not_analyzed"}, "exit_status"=>{"type"=>"string", "index"=>"not_analyzed"}, "summary"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "scan_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "tags"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "version"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "nmap_port"=>{"properties"=>{"@timestamp"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "address"=>{"type"=>"string", "index"=>"not_analyzed"}, "addresses"=>{"properties"=>{"addr"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "arguments"=>{"type"=>"string", "analyzer"=>"whitespace", "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "end_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_name"=>{"type"=>"string", 
"index"=>"not_analyzed"}, "dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "real_region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "host"=>{"type"=>"string", "index"=>"not_analyzed"}, "id"=>{"type"=>"string", "index"=>"not_analyzed"}, "ip"=>{"type"=>"string", "index"=>"not_analyzed"}, "ipv4"=>{"type"=>"ip"}, "os"=>{"properties"=>{"classes"=>{"properties"=>{"accuracy"=>{"type"=>"integer"}, "family"=>{"type"=>"string", "index"=>"not_analyzed"}, "gen"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "vendor"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "matches"=>{"properties"=>{"accuracy"=>{"type"=>"long"}, "name"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "ports_used"=>{"type"=>"long"}}}, "port"=>{"properties"=>{"number"=>{"type"=>"integer"}, "protocol"=>{"type"=>"string"}, "reason"=>{"type"=>"string"}, "service"=>{"properties"=>{"confidence"=>{"type"=>"integer"}, "fingerprint_method"=>{"type"=>"string"}, "name"=>{"type"=>"string"}, "product"=>{"type"=>"string"}, "ssl"=>{"type"=>"boolean"}}}, "state"=>{"type"=>"string"}}}, "scan_host_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "scan_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "start_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "status"=>{"properties"=>{"reason"=>{"type"=>"string", "index"=>"not_analyzed"}, "state"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "tags"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "uptime"=>{"properties"=>{"last_boot"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "seconds"=>{"type"=>"long"}}}, 
"version"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "nmap_traceroute_link"=>{"properties"=>{"@timestamp"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "address"=>{"type"=>"string", "index"=>"not_analyzed"}, "addresses"=>{"properties"=>{"addr"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "arguments"=>{"type"=>"string", "analyzer"=>"whitespace", "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "end_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "from"=>{"properties"=>{"address"=>{"type"=>"string", "index"=>"not_analyzed", "fields"=>{"address_ipv4"=>{"type"=>"ip"}}}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "real_region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "hostname"=>{"type"=>"string", "index"=>"not_analyzed"}, "index"=>{"type"=>"integer"}, "rtt"=>{"type"=>"long"}, "ttl"=>{"type"=>"integer"}}}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, 
"country_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "real_region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "host"=>{"type"=>"string", "index"=>"not_analyzed"}, "id"=>{"type"=>"string", "index"=>"not_analyzed"}, "ip"=>{"type"=>"string", "index"=>"not_analyzed"}, "ipv4"=>{"type"=>"ip"}, "os"=>{"properties"=>{"classes"=>{"properties"=>{"accuracy"=>{"type"=>"integer"}, "family"=>{"type"=>"string", "index"=>"not_analyzed"}, "gen"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "vendor"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "matches"=>{"properties"=>{"accuracy"=>{"type"=>"long"}, "name"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "ports_used"=>{"type"=>"long"}}}, "rtt_diff"=>{"type"=>"integer"}, "scan_host_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "scan_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "start_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "status"=>{"properties"=>{"reason"=>{"type"=>"string", "index"=>"not_analyzed"}, "state"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "tags"=>{"type"=>"string", "index"=>"not_analyzed"}, "to"=>{"properties"=>{"address"=>{"type"=>"string", "index"=>"not_analyzed", "fields"=>{"address_ipv4"=>{"type"=>"ip"}}}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_name"=>{"type"=>"string", "index"=>"not_analyzed"}, 
"dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string"}, "real_region_name"=>{"type"=>"string"}, "region_name"=>{"type"=>"string"}, "timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "hostname"=>{"type"=>"string", "index"=>"not_analyzed"}, "index"=>{"type"=>"integer"}, "rtt"=>{"type"=>"long"}, "ttl"=>{"type"=>"integer"}}}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "uptime"=>{"properties"=>{"last_boot"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "seconds"=>{"type"=>"long"}}}, "version"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "nmap_host"=>{"properties"=>{"@timestamp"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "@version"=>{"type"=>"string", "index"=>"not_analyzed"}, "address"=>{"type"=>"string", "index"=>"not_analyzed"}, "addresses"=>{"properties"=>{"addr"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "arguments"=>{"type"=>"string", "analyzer"=>"whitespace", "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "end_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "real_region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, 
"timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "host"=>{"type"=>"string", "index"=>"not_analyzed"}, "hostname"=>{"properties"=>{"name"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "id"=>{"type"=>"string", "index"=>"not_analyzed"}, "ip"=>{"type"=>"string", "index"=>"not_analyzed"}, "ipv4"=>{"type"=>"ip"}, "mac"=>{"type"=>"string", "index"=>"not_analyzed"}, "os"=>{"properties"=>{"classes"=>{"properties"=>{"accuracy"=>{"type"=>"integer"}, "family"=>{"type"=>"string", "index"=>"not_analyzed"}, "gen"=>{"type"=>"string", "index"=>"not_analyzed"}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "vendor"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "matches"=>{"properties"=>{"accuracy"=>{"type"=>"long"}, "name"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "ports_used"=>{"type"=>"long"}}}, "ports"=>{"properties"=>{"number"=>{"type"=>"integer"}, "protocol"=>{"type"=>"string", "index"=>"not_analyzed"}, "reason"=>{"type"=>"string", "index"=>"not_analyzed"}, "service"=>{"properties"=>{"confidence"=>{"type"=>"long"}, "fingerprint_method"=>{"type"=>"string", "index"=>"not_analyzed"}, "name"=>{"type"=>"string", "index"=>"not_analyzed"}, "product"=>{"type"=>"string", "index"=>"not_analyzed"}, "ssl"=>{"type"=>"boolean"}}}, "state"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "scan_id"=>{"type"=>"string", "index"=>"not_analyzed"}, "start_time"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "status"=>{"properties"=>{"reason"=>{"type"=>"string", "index"=>"not_analyzed"}, "state"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "tags"=>{"type"=>"string", "index"=>"not_analyzed"}, "traceroute"=>{"properties"=>{"hops"=>{"properties"=>{"address"=>{"type"=>"string", "index"=>"not_analyzed"}, "geoip"=>{"properties"=>{"area_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "city_name"=>{"type"=>"string"}, "continent_code"=>{"type"=>"string", "index"=>"not_analyzed"}, 
"country_code2"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_code3"=>{"type"=>"string", "index"=>"not_analyzed"}, "country_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "dma_code"=>{"type"=>"integer"}, "ip"=>{"type"=>"ip"}, "latitude"=>{"type"=>"double"}, "location"=>{"type"=>"geo_point"}, "longitude"=>{"type"=>"double"}, "postal_code"=>{"type"=>"string", "index"=>"not_analyzed"}, "real_region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "region_name"=>{"type"=>"string", "index"=>"not_analyzed"}, "timezone"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "hostname"=>{"type"=>"string", "index"=>"not_analyzed"}, "index"=>{"type"=>"integer"}, "rtt"=>{"type"=>"long"}, "ttl"=>{"type"=>"integer"}}}, "port"=>{"type"=>"integer"}, "protocol"=>{"type"=>"string", "index"=>"not_analyzed"}}}, "type"=>{"type"=>"string", "index"=>"not_analyzed"}, "uptime"=>{"properties"=>{"last_boot"=>{"type"=>"date", "format"=>"strict_date_optional_time||epoch_millis"}, "seconds"=>{"type"=>"long"}, "uptime"=>{"type"=>"integer"}}}, "version"=>{"type"=>"string", "index"=>"not_analyzed"}}}}}}
12:50:12.246 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Found existing Elasticsearch template. Skipping template management {:name=>"logstash_nmap"}
12:50:12.247 [[main]-pipeline-manager] INFO  logstash.outputs.elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x2883e3c1 URL://esnode1-dev:9200>, #<URI::Generic:0xfc70d4a URL://esnode2-dev:9200>, #<URI::Generic:0x2018ee62 URL://esnode3-dev:9200>]}
12:50:12.249 [[main]-pipeline-manager] INFO  logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
12:50:12.265 [[main]-pipeline-manager] INFO  logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
12:50:12.267 [[main]-pipeline-manager] INFO  logstash.filters.geoip - Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
12:50:12.270 [[main]-pipeline-manager] INFO  logstash.pipeline - Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
12:50:12.293 [[main]-pipeline-manager] DEBUG logstash.plugins.registry - On demand adding plugin to the registry {:name=>"json", :type=>"codec", :class=>LogStash::Codecs::JSON}
12:50:12.296 [[main]-pipeline-manager] DEBUG logstash.codecs.json - config LogStash::Codecs::JSON/@id = "json_4d6852a9-86a5-41db-b650-013bc68d1e3a"
12:50:12.297 [[main]-pipeline-manager] DEBUG logstash.codecs.json - config LogStash::Codecs::JSON/@enable_metric = true
12:50:12.297 [[main]-pipeline-manager] DEBUG logstash.codecs.json - config LogStash::Codecs::JSON/@charset = "UTF-8"
12:50:12.298 [[main]-pipeline-manager] INFO  logstash.pipeline - Pipeline main started
12:50:12.321 [Api Webserver] DEBUG logstash.agent - Starting puma
12:50:12.395 [Api Webserver] DEBUG logstash.agent - Trying to start WebServer {:port=>9600}
12:50:12.397 [Api Webserver] DEBUG logstash.api.service - [api-service] start
12:50:12.420 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
12:50:17.304 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:22.305 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:27.305 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:32.305 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:37.305 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:42.307 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:45.019 [Ruby-0-Thread-20: /usr/share/logstash/vendor/bundle/jruby/1.9/gems/puma-2.16.0-java/lib/puma/thread_pool.rb:61] ERROR logstash.inputs.http - unable to process event {"request_method"=>"POST", "request_path"=>"/", "request_uri"=>"/", "http_version"=>"HTTP/1.1", "http_user_agent"=>"curl/7.35.0", "http_host"=>"192.168.10.20:8000", "http_accept"=>"*/*", "http_x_nmap_target"=>"local-subnet", "content_length"=>"19525", "content_type"=>"application/x-www-form-urlencoded", "http_expect"=>"100-continue"}. exception => java.lang.ArrayIndexOutOfBoundsException: -1
12:50:47.306 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:52.307 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:50:57.307 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:51:02.307 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
^C12:51:05.841 [SIGINT handler] WARN  logstash.runner - SIGINT received. Shutting down the agent.
12:51:05.843 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.os - PeriodicPoller: Stopping
12:51:05.844 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.jvm - PeriodicPoller: Stopping
12:51:05.844 [LogStash::Runner] DEBUG logstash.instrument.periodicpoller.persistentqueue - PeriodicPoller: Stopping
12:51:05.846 [LogStash::Runner] WARN  logstash.agent - stopping pipeline {:id=>"main"}
12:51:05.847 [LogStash::Runner] DEBUG logstash.pipeline - Closing inputs
12:51:05.847 [LogStash::Runner] DEBUG logstash.inputs.http - stopping {:plugin=>"LogStash::Inputs::Http"}
12:51:05.850 [[main]<http] DEBUG logstash.inputs.http - closing {:plugin=>"LogStash::Inputs::Http"}
12:51:05.850 [[main]-pipeline-manager] DEBUG logstash.pipeline - Input plugins stopped! Will shutdown filter/output workers.
12:51:05.851 [LogStash::Runner] DEBUG logstash.pipeline - Closed inputs
12:51:05.915 [Ruby-0-Thread-9: /usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:538] DEBUG logstash.pipeline - Pushing flush onto pipeline
12:51:05.916 [[main]-pipeline-manager] DEBUG logstash.pipeline - Pushing shutdown {:thread=>"#<Thread:0x63be73e6 sleep>"}
12:51:05.916 [[main]-pipeline-manager] DEBUG logstash.pipeline - Pushing shutdown {:thread=>"#<Thread:0x52440c85 run>"}
12:51:05.916 [[main]-pipeline-manager] DEBUG logstash.pipeline - Shutdown waiting for worker thread #<Thread:0x63be73e6>
12:51:05.931 [[main]-pipeline-manager] DEBUG logstash.pipeline - Shutdown waiting for worker thread #<Thread:0x52440c85>
12:51:05.931 [[main]-pipeline-manager] DEBUG logstash.filters.drop - closing {:plugin=>"LogStash::Filters::Drop"}
12:51:05.931 [[main]-pipeline-manager] DEBUG logstash.filters.mutate - closing {:plugin=>"LogStash::Filters::Mutate"}
12:51:05.931 [[main]-pipeline-manager] DEBUG logstash.filters.geoip - closing {:plugin=>"LogStash::Filters::GeoIP"}
12:51:05.932 [[main]-pipeline-manager] DEBUG logstash.filters.geoip - closing {:plugin=>"LogStash::Filters::GeoIP"}
12:51:05.932 [[main]-pipeline-manager] DEBUG logstash.filters.geoip - closing {:plugin=>"LogStash::Filters::GeoIP"}
12:51:05.932 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - closing {:plugin=>"LogStash::Outputs::ElasticSearch"}
12:51:05.932 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Stopping sniffer
12:51:05.932 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Stopping resurrectionist
12:51:05.994 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Waiting for in use manticore connections
12:51:05.994 [[main]-pipeline-manager] DEBUG logstash.outputs.elasticsearch - Closing adapter #<LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter:0x20790c1c>
12:51:05.995 [[main]-pipeline-manager] DEBUG logstash.pipeline - Pipeline main has been shutdown

I sent data to Logstash like this:

nmap -A 192.168.55.0/24 -oX - | curl -H "x-nmap-target: local-subnet" http://192.168.10.20:8000 -d @-
landonstewart commented 7 years ago

Also, @andrewvc, I should mention this:

With the codec:

root@logstash1-dev:/usr/share/logstash# head -7 /etc/logstash/conf.d/12-input-nmap.conf 
input {
  http {
    port => 8000
    codec => nmap
    tags => [nmap]
  }
}
root@logstash1-dev:/usr/share/logstash# service logstash restart
root@logstash1-dev:/usr/share/logstash# curl -I localhost:8000
HTTP/1.1 500 Internal Server Error
Content-Type: text/plain
Content-Length: 14

Without the codec:

root@logstash1-dev:/usr/share/logstash# head -7 /etc/logstash/conf.d/12-input-nmap.conf 
input {
  http {
    port => 8000
    # codec => nmap
    tags => [nmap]
  }
}
root@logstash1-dev:/usr/share/logstash# service logstash restart
root@logstash1-dev:/usr/share/logstash# curl -I localhost:8000
HTTP/1.1 200 OK
Content-Type: text/plain
Content-Length: 2
andrewvc commented 7 years ago

@TheZeke ::sigh:: this is almost certainly due to the variability of the Nmap XML format. As far as XML formats go, that one is interesting... I can't reproduce this with scan data from my local network, so yours must include some output mine doesn't. Do you mind sharing your scan.xml with me? You should be able to pipe it to a file with > scan.xml instead of sending it to cURL. Ideally we could add your scan XML to the spec fixtures so it becomes part of the test suite. I'd love it if you could open a PR adding your file to that tree :)

I've opened a PR on the HTTP input to make debugging this easier as well, but I really need the scan data to fix the codec.

landonstewart commented 7 years ago

Hello @andrewvc,

The XML requested is attached. I've obfuscated the hostname and the IP address that was scanned but left everything else intact.

scan.xml.gz

I can't even get a 200 status from the http input {} with a simple GET when using the nmap codec, but I do get a 200 when the codec is commented out. This happens before any XML is sent at all, so I'm not sure the problem is related to the nmap XML format at this point, though I could be wrong of course. This is shown in my update yesterday here.

Here's what happens when I pipe the XML directly to the curl command:

# time cat scan.xml | curl -H "x-nmap-target: local-subnet" http://192.168.10.20:8000 -d @-
<html><body><h1>503 Service Unavailable</h1>
No server is available to handle this request.
</body></html>

real    0m0.024s
user    0m0.005s
sys 0m0.017s
andrewvc commented 7 years ago

@TheZeke I'm realizing that the problem is the use of -d in cURL where it should be --data-binary; -d mangles the body content.

I'll update the blog post and examples and close this ticket after that.
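(Editor's note: a minimal illustration of the mangling described above. curl's `-d @file` mode strips carriage returns and newlines from the file before sending, which flattens a multi-line Nmap XML document before it ever reaches the codec; `--data-binary` sends the bytes unchanged. The `tr` pipeline below is only a stand-in for that stripping, not curl itself, and the host, port, and header values in the commented command are the ones used earlier in this thread.)

```shell
# curl -d @file strips CR/LF from the file's contents; simulate that stripping:
printf '<nmaprun>\n  <host/>\n</nmaprun>\n' | tr -d '\r\n'
# prints: <nmaprun>  <host/></nmaprun>   (newlines gone, so XML parsing misbehaves)

# The corrected upload preserves the body byte-for-byte (host/port from this
# thread; requires a running Logstash, so it is left commented out here):
# nmap -A 192.168.55.0/24 -oX - | \
#   curl -H "x-nmap-target: local-subnet" http://192.168.10.20:8000 --data-binary @-
```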

andrewvc commented 7 years ago

@TheZeke OK, I updated https://www.elastic.co/blog/using-nmap-logstash-to-gain-insight-into-your-network to use --data-binary. I'll close this ticket now :)

Please open a new one if you have continuing issues!