yokawasa / fluent-plugin-azure-loganalytics

Azure Log Analytics output plugin for Fluentd
https://rubygems.org/gems/fluent-plugin-azure-loganalytics
Apache License 2.0

Fix incorrect statement split, use more relaxed JSON encoder #10

Closed: smira closed this pull request 5 years ago

smira commented 5 years ago

The log.fatal statement was split incorrectly across lines, so part of the message was never actually logged.
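
A minimal illustration of that pitfall (hypothetical code, not the plugin's actual source): when the first line of the call is already a complete Ruby expression, the parser ends the statement there and the continuation line is evaluated on its own and thrown away.

  # Hypothetical example: the first line is a complete expression, so Ruby ends
  # the statement there; the second line is evaluated in void context and its
  # result is discarded, never reaching the logger.
  log.fatal "Exception occurred while posting records."
    "Dumped records: #{records.to_json}"   # silently discarded

  # Keeping the argument as one expression (adjacent string literals joined by
  # a backslash line continuation) logs the whole message:
  log.fatal "Exception occurred while posting records. " \
            "Dumped records: #{records.to_json}"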

Use 'yajl' instead of the default JSON encoder, due to the following problem observed:

2019-07-29 19:47:09 +0000 [warn]: #0 failed to flush the buffer. retry_time=0 next_retry_seconds=2019-07-29 19:47:10 +0000 chunk="58e09d319b82a042ddc2fef0edb883b1" error_class=Encoding::UndefinedConversionError error="\"\\xE2\" from ASCII-8BIT to UTF-8"
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluent-plugin-azure-loganalytics-0.3.1/lib/fluent/plugin/out_azure-loganalytics.rb:101:in `encode'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluent-plugin-azure-loganalytics-0.3.1/lib/fluent/plugin/out_azure-loganalytics.rb:101:in `to_json'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluent-plugin-azure-loganalytics-0.3.1/lib/fluent/plugin/out_azure-loganalytics.rb:101:in `rescue in write'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluent-plugin-azure-loganalytics-0.3.1/lib/fluent/plugin/out_azure-loganalytics.rb:93:in `write'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluentd-1.4.2/lib/fluent/plugin/output.rb:1125:in `try_flush'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluentd-1.4.2/lib/fluent/plugin/output.rb:1425:in `flush_thread_run'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluentd-1.4.2/lib/fluent/plugin/output.rb:454:in `block (2 levels) in start'
  2019-07-29 19:47:09 +0000 [warn]: #0 /usr/lib/ruby/gems/2.5.0/gems/fluentd-1.4.2/lib/fluent/plugin_helper/thread.rb:78:in `block in thread_create'
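
A rough sketch of why switching the encoder helps (assumed behavior, not the plugin's exact code): the stdlib JSON generator transcodes ASCII-8BIT strings to UTF-8 and raises on high bytes such as the \xE2 above, while yajl is more relaxed about non-UTF-8 input.

  require 'json'
  require 'yajl'

  # A record value that is a binary (ASCII-8BIT) string containing a high byte,
  # similar to the \xE2 in the trace above.
  records = [{ "message" => "broken \xE2 byte".b }]

  begin
    records.to_json   # stdlib JSON attempts an ASCII-8BIT -> UTF-8 conversion and raises
  rescue Encoding::UndefinedConversionError => e
    puts "to_json failed: #{e.message}"
  end

  # yajl (the yajl-ruby gem) is assumed to emit the bytes without transcoding:
  puts Yajl::Encoder.encode(records)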
yokawasa commented 5 years ago

@smira Thank you so much for the PR. I confirmed it worked perfectly!

smira commented 5 years ago

thanks, that was really fast!

I don't know what the best fix is, but I think there is an issue with this log line when records is big enough: formatting the message might eat a lot of memory. Probably the simplest approach is to cut the dump down to some reasonable size, like 32 KB?
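
One possible shape for that (purely a sketch; the record-by-record loop, the helper name, and the 32 KB cap are assumptions, not anything from this PR):

  require 'yajl'

  # Hypothetical helper: serialize records for the fatal log message one at a
  # time and stop once roughly 32 KB has accumulated, so a huge chunk does not
  # have to be rendered in full just to be logged.
  MAX_LOGGED_BYTES = 32 * 1024

  def records_for_log(records, limit = MAX_LOGGED_BYTES)
    out = +""
    records.each_with_index do |record, i|
      fragment = Yajl::Encoder.encode(record)
      if out.bytesize + fragment.bytesize > limit
        out << "... (#{records.size - i} more records omitted)"
        break
      end
      out << fragment << "\n"
    end
    out
  end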