logstash-plugins / logstash-output-influxdb


Parsing data from the metrics filter #48

Open elvarb opened 8 years ago

elvarb commented 8 years ago

When you have a config like this:

        metrics {
                meter => "winevent_%{Channel}"
                add_tag => "metric"
        }

The resulting JSON will look something like this:

{
  "@version": "1",
  "@timestamp": "2016-08-24T13:42:25.207Z",
  "message": "loginput",
  "winevent_Application": {
    "count": 208,
    "rate_1m": 0.017887484125736455,
    "rate_5m": 0.027711454136902635,
    "rate_15m": 0.029916670269682488
  },
  "winevent_Security": {
    "count": 35937,
    "rate_1m": 9.322763401877156,
    "rate_5m": 7.82061391723383,
    "rate_15m": 6.896426179028346
  },
  "winevent_System": {
    "count": 1249,
    "rate_1m": 0.13826098210302384,
    "rate_5m": 0.1701106403241058,
    "rate_15m": 0.1892950940574345
  },
  "tags": [
    "metric"
  ]
}

With use_event_fields_for_data_points enabled, I would expect it to create a measurement called winevent_Application (and likewise for the others) with count, rate_1m, and so on as its values, but sadly it doesn't seem to work.

Is there some way to get this correctly into InfluxDB with the output plugin?
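
One workaround sketch, not something confirmed in this thread: flatten the nested metric hashes into top-level scalar fields with a ruby filter before the influxdb output, so that use_event_fields_for_data_points only ever sees simple values. The winevent_* field names below are taken from the example event above, and the event.get/set/remove calls assume the Logstash 5.x event API.

        filter {
          ruby {
            # Flatten each nested winevent_* hash produced by the metrics
            # filter into flat fields like winevent_Application_count,
            # then drop the original nested field.
            code => '
              event.to_hash.each do |k, v|
                next unless k.start_with?("winevent_") && v.is_a?(Hash)
                v.each { |mk, mv| event.set("#{k}_#{mk}", mv) }
                event.remove(k)
              end
            '
          }
        }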

CliveJL commented 7 years ago

+1 - I have just hit this issue with a similar config using the "use_event_fields_for_data_points" option. The Logstash instance's log shows the following:

[2017-02-27T16:23:11,985][WARN ][logstash.outputs.influxdb] Failed to flush outgoing items {:outgoing_count=>1, :exception=>"NoMethodError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:340:in `quoted'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:225:in `events_to_request_body'", "org/jruby/RubyHash.java:1342:in `each'", "org/jruby/RubyEnumerable.java:757:in `map'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:225:in `events_to_request_body'", "org/jruby/RubyArray.java:2414:in `map'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:222:in `events_to_request_body'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:169:in `flush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:221:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:159:in `buffer_receive'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-influxdb-4.0.0/lib/logstash/outputs/influxdb.rb:163:in `receive'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:92:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/legacy.rb:19:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:336:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:335:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:293:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'"]}

I haven't had any problems when manually defining other metric events with more "static" field names in the output config, using the "data_points" array together with "coerce_values". The issue only seems to occur with the "automatic" option ("use_event_fields_for_data_points"); judging by the backtrace, the nested hash values presumably end up in the plugin's quoted method, which expects scalar values.
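
For illustration, a static config of that kind might look roughly like this; the host, database, measurement and field references here are placeholders for this sketch, not my exact setup:

        output {
          influxdb {
            host        => "localhost"            # placeholder InfluxDB host
            db          => "logstash_metrics"     # placeholder database name
            measurement => "winevent_security"    # placeholder measurement
            # Static field references into the nested hash produced by the
            # metrics filter
            data_points => {
              "count"   => "%{[winevent_Security][count]}"
              "rate_1m" => "%{[winevent_Security][rate_1m]}"
            }
            # sprintf'd values arrive as strings, so coerce them back to
            # numeric types before writing to InfluxDB
            coerce_values => {
              "count"   => "integer"
              "rate_1m" => "float"
            }
          }
        }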