fluent / fluent-plugin-kafka

Kafka input and output plugin for Fluentd

SASL_PLAIN configuration does not work #484

Closed mahmoud-mahdi closed 1 year ago

mahmoud-mahdi commented 1 year ago

Describe the bug

I am trying to forward logs to Azure Event Hub, which uses SASL_PLAIN, but the connection does not work.

To Reproduce

Add `username` and `password` to the Fluentd Kafka output configuration to enable SASL_PLAIN.

Expected behavior

The SASL_PLAIN mechanism is used and Fluentd is able to establish communication with the Event Hub.

Your Environment

- Fluentd version: fluentd-1.14.6
- TD Agent version: 
- fluent-plugin-kafka version: fluent-plugin-kafka-0.17.5
- ruby-kafka version: ruby-kafka-1.4.0
- Operating system: fluentd used in Openshift logging 5.6.2
- Kernel version:

Your Configuration

The configuration is as follows:

```
# Ship logs to specific outputs
<label @EVENTHUB>
  <match **>
    @type kafka2
    @id eventhub
    brokers NAMESPACE.servicebus.windows.net:9093
    default_topic evhName
    use_event_time true
    username "#{File.exists?('/var/run/ocp-collector/secrets/eventhub-secret/username') ? open('/var/run/ocp-collector/secrets/eventhub-secret/username','r') do |f|f.read end : ''}"
    password "#{File.exists?('/var/run/ocp-collector/secrets/eventhub-secret/password') ? open('/var/run/ocp-collector/secrets/eventhub-secret/password','r') do |f|f.read end : ''}"
    #sasl_over_ssl false
    #scram_mechanism PLAINTEXT
    <format>
      @type json
    </format>
    <buffer evhnName>
      @type file
      path '/var/lib/fluentd/eventhub'
      flush_mode interval
      flush_interval 1s
      flush_thread_count 2
      retry_type exponential_backoff
      retry_wait 1s
      retry_max_interval 60s
      retry_timeout 60m
      queued_chunks_limit_size "#{ENV['BUFFER_QUEUE_LIMIT'] || '32'}"
      total_limit_size "#{ENV['TOTAL_LIMIT_SIZE_PER_BUFFER'] || '8589934592'}"
      chunk_limit_size "#{ENV['BUFFER_SIZE_LIMIT'] || '8m'}"
      overflow_action block
      disable_chunk_backup true
    </buffer>
  </match>
</label>
```
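The `username`/`password` directives above embed inline Ruby via `"#{...}"` to read Kubernetes secret files. A cleaner equivalent of that expression, using the non-deprecated `File.exist?` and `File.read` (the helper name `read_secret` is illustrative, not part of the plugin):

```ruby
# Equivalent of the inline secret-reading expression from the config:
# return the file contents if the secret file exists, else an empty
# string so the plugin falls back to unauthenticated mode.
def read_secret(path)
  File.exist?(path) ? File.read(path) : ''
end

# Same paths as in the configuration above.
username = read_secret('/var/run/ocp-collector/secrets/eventhub-secret/username')
password = read_secret('/var/run/ocp-collector/secrets/eventhub-secret/password')
```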

Your Error Log

```
$ oc logs collector-p9nbn -c collector
POD_IPS: 10.128.3.198, PROM_BIND_IP: 0.0.0.0
Setting each total_size_limit for 2 buffers to 41190554112 bytes
Setting queued_chunks_limit_size for each buffer to 4910
Setting chunk_limit_size for each buffer to 8388608
/var/lib/fluentd/pos/journal_pos.json exists, checking if yajl parser able to parse this json file without any error.
ruby 2.7.6p219 (2022-04-12 revision c9c2245c0a) [x86_64-linux]
RUBY_GC_HEAP_OLDOBJECT_LIMIT_FACTOR=0.900000 (default value: 2.000000)
checking if /var/lib/fluentd/pos/journal_pos.json a valid json by calling yajl parser
2023-03-07 12:31:30 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
2023-03-07 12:31:30 +0000 [warn]: '@' is the system reserved prefix. It works in the nested configuration for now but it will be rejected: @timestamp
/usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:8: warning: already initialized constant TRANSPORT_CLASS
/usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:3: warning: previous definition of TRANSPORT_CLASS was here
/usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:25: warning: already initialized constant SELECTOR_CLASS
/usr/local/share/gems/gems/fluent-plugin-elasticsearch-5.2.2/lib/fluent/plugin/elasticsearch_compat.rb:20: warning: previous definition of SELECTOR_CLASS was here
2023-03-07 12:31:33 +0000 [error]: unexpected error error_class=ArgumentError error="SASL authentication requires that SSL is configured"
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka/client.rb:120:in `initialize'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka.rb:366:in `new'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/ruby-kafka-1.4.0/lib/kafka.rb:366:in `new'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/out_kafka2.rb:111:in `refresh_client'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluent-plugin-kafka-0.17.5/lib/fluent/plugin/out_kafka2.rb:196:in `start'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:203:in `block in start'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:182:in `block (2 levels) in lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:121:in `block (2 levels) in lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:120:in `each'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:120:in `block in lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:113:in `each'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/agent.rb:113:in `lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:181:in `block in lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:178:in `each'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:178:in `lifecycle'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/root_agent.rb:202:in `start'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/engine.rb:248:in `start'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/engine.rb:147:in `run'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:720:in `block in run_worker'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:971:in `main_process'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/supervisor.rb:711:in `run_worker'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/lib/fluent/command/fluentd.rb:376:in `<top (required)>'
  2023-03-07 12:31:33 +0000 [error]: /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
  2023-03-07 12:31:33 +0000 [error]: /usr/share/rubygems/rubygems/core_ext/kernel_require.rb:83:in `require'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/share/gems/gems/fluentd-1.14.6/bin/fluentd:15:in `<top (required)>'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/bin/fluentd:23:in `load'
  2023-03-07 12:31:33 +0000 [error]: /usr/local/bin/fluentd:23:in `<main>'
2023-03-07 12:31:33 +0000 [error]: unexpected error error_class=ArgumentError error="SASL authentication requires that SSL is configured"
2023-03-07 12:31:33 +0000 [error]: suppressed same stacktrace
```
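The stack trace points at `Kafka::Client#initialize`: because `sasl_over_ssl` is commented out as `false` while SASL credentials are supplied, the client is constructed with a SASL authenticator but no SSL context, and ruby-kafka rejects that combination up front. A simplified sketch of that guard (not the actual library source, just an illustration of the check that raises this `ArgumentError`):

```ruby
# Simplified reimplementation of the validation ruby-kafka performs when
# building a client: SASL credentials without an SSL context are refused
# unless SSL-over-SASL has been explicitly disabled.
def check_sasl_requires_ssl(sasl_enabled, ssl_context)
  if sasl_enabled && ssl_context.nil?
    raise ArgumentError, "SASL authentication requires that SSL is configured"
  end
end
```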

Additional context

No response

mahmoud-mahdi commented 1 year ago

It is working now: `sasl_over_ssl` should be set to `true`, and `ssl_ca_cert` should also be provided.
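For reference, a minimal sketch of the relevant part of the working output block. The `ssl_ca_cert` path below is an example, not from this issue; point it at the CA bundle shipped in your image, since Azure Event Hubs endpoints are signed by a public CA:

```
<match **>
  @type kafka2
  brokers NAMESPACE.servicebus.windows.net:9093
  default_topic evhName
  username "..."
  password "..."
  sasl_over_ssl true
  # Example path; use the CA bundle location of your container image.
  ssl_ca_cert /etc/ssl/certs/ca-bundle.crt
</match>
```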