fluent / fluent-plugin-kafka

Kafka input and output plugin for Fluentd

logging level #487

Open samar-elsayed opened 1 year ago

samar-elsayed commented 1 year ago

Describe the bug

I have tried log levels 6 and 7 for rdkafka2, but I can't see log messages that describe Fluentd sending to Kafka, for example:

(This kind of log appears when using @type kafka_buffered.) Sending 6 messages to ip-10-1-17-50.eu-central-1.compute.internal:9080 (node_id=1)

This doesn't happen with rdkafka2; the only thing logged is:

2023-04-06 10:26:22 +0000 [info]: #0 starting fluentd worker pid=18 ppid=6 worker=0
2023-04-06 10:26:22 +0000 [info]: #0 following tail of /logs/quarkus.log
2023-04-06 10:26:22 +0000 [info]: #0 fluentd worker is now running worker=0
2023-04-06 10:29:11 +0000 [warn]: #0 pattern not matched: "\tat java.base/java.util.Hashtable.get(Hashtable.java:383)"

So even the matched records don't appear in the logs, although they are matched and sent to Kafka successfully.
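
For reference, a minimal sketch of the knobs that should control rdkafka2 verbosity in a setup like this; the @log_level line and the "debug" contexts are assumptions about Fluentd / librdkafka options, not part of the configuration reported below:

<match file.ulff>
   @type rdkafka2
   # Fluentd per-plugin verbosity (assumption, not in the original config)
   @log_level debug
   brokers "<broker>"
   default_topic ulff
   rdkafka_options {
     "log_level": 7,
     "debug": "broker,topic,msg"
   }
</match>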

To Reproduce

Use the same configuration as shown in the Your Configuration section below.

Expected behavior

See log messages that describe Fluentd sending to Kafka, for example:

(This kind of log appears when using @type kafka_buffered.) Sending 6 messages to ip-10-1-17-50.eu-central-1.compute.internal:9080 (node_id=1)

Your Environment

- Fluentd version: 1.14.0
- fluent-plugin-kafka version: 0.17.5
- ruby-kafka version: 1.5.0
- rdkafka: 0.12.0
- Operating system: Alpine Linux v3.13
- Kernel version: 5.15.58-flatcar

Your Configuration

<source>
   @type tail
   path /logs/quarkus.log
   tag file.all
   <parse>
       @type regexp
       expression /^(?<datetime>[0-9- :,]+) (?<host>[0-9a-zA-Z\-\.\+]+) (?<processname>.+?) (?<loglevel>.+) +\[(?<logger>[a-zA-Z-.]+?)\] \((?<thread>.+?)\) (?<logmessage>.+)$/
   </parse>
</source>
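
For illustration only, a hypothetical Quarkus log line that the tail regexp above would match (all values invented, not taken from the report):

2023-04-06 10:26:22,123 myhost.example.com quarkus-run.jar INFO [io.quarkus] (main) ULFFRecord: {"orderId": 42, "status": "OK"}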

<match file.all>
   @type rewrite_tag_filter
   <rule>
       key logmessage
       pattern /ULFFRecord\:\ (?<ulffrecord>.+)$/
       tag file.ulff
   </rule>
   <rule>
       key logmessage
       pattern /./
       tag file.generic
   </rule>
</match>
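
To illustrate the routing above with hypothetical logmessage values:

ULFFRecord: {"orderId": 42, "status": "OK"}   -> retagged file.ulff by the first rule
Started order processing for 42               -> retagged file.generic by the catch-all rule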

<filter file.ulff>
   @type parser
   key_name logmessage
   <parse>
       @type regexp
       expression /^ULFFRecord\:\ (?<ulffrecord>.+)$/
   </parse>
</filter>

<filter file.ulff>
   @type parser
   format json
   key_name ulffrecord
</filter>
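
Taken together, and assuming the parser filters' default behavior of replacing the record with the parsed fields, the two filters above would transform a hypothetical file.ulff event roughly like this (sample values invented):

# original record from the tail source (other captured fields omitted)
{"logmessage": "ULFFRecord: {\"orderId\": 42, \"status\": \"OK\"}", ...}

# after the regexp parser filter
{"ulffrecord": "{\"orderId\": 42, \"status\": \"OK\"}"}

# after the json parser filter
{"orderId": 42, "status": "OK"}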

<match file.ulff>
   @type rdkafka2
   brokers "<broker>"
   get_kafka_client_log true
   default_topic ulff
   flush_interval 3s
   use_event_time true
   username "#{ENV["nonprd_ULFF_KAFKA_USER"]}"
   password "#{ENV["nonprd_ULFF_KAFKA_PASS"]}"
   rdkafka_options {
     "log_level":6,
     "sasl.mechanism": "SCRAM-SHA-512",
     "security.protocol": "sasl_ssl"
   }
   <buffer>
       flush_mode interval
       flush_interval 2s
   </buffer>
   <format>
     @type "json"
   </format>
</match>
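
For illustration, with the json formatter in this match block, the expanded record from the earlier example would be produced to the ulff topic roughly as this payload (hypothetical values):

{"orderId":42,"status":"OK"}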

<match file.generic>
   @type rdkafka2
   enable_ruby
   brokers "<broker>"
   get_kafka_client_log true
   default_topic custom
   use_event_time true
   username "#{ENV["nonprd_ULFF_KAFKA_USER"]}"
   password "#{ENV["nonprd_ULFF_KAFKA_PASS"]}"
   rdkafka_options {
     "log_level":6,
     "sasl.mechanism": "SCRAM-SHA-512",
     "security.protocol": "sasl_ssl"
   }
   <buffer>
       flush_mode interval
       flush_interval 2s
   </buffer>
   <format>
     @type "json"
   </format>
</match>
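
One more assumption worth stating: librdkafka's log_level follows syslog severities (6 = info, 7 = debug), and if the forwarded client logs come in at debug severity they stay hidden while Fluentd itself runs at the default info level. A minimal sketch of raising Fluentd's global log level:

<system>
   log_level debug
</system>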

Your Error Log

I can't see log messages that describe Fluentd sending to Kafka.

Additional context

No response

samar-elsayed commented 1 year ago

@ashie

samar-elsayed commented 1 year ago

@repeatedly