Closed: @AndreAga closed this issue 2 years ago.
I agree with @AndreAga; Logstash's Kafka output plugin also supports Kerberos SASL. Kerberos auth is a must-have feature for the Beats library.
@urso any news regarding this?
@gmoskovicz Sorry, no updates on this ticket.
+1
+1
+1
+1
+1
@jsoriano Is there a plan to add Kerberos Support in beats?
+1
+1
+1
+1
+1
One of our customers offered this response on why Kerberos is better than SSL:
In the case of the Confluent article [showing SSL auth], they are using a very loose term for authentication, in saying that it is performing mutual authentication of the certificates and only validates that the certificate is trusted by way of the CA certificates. In other words, this is machine authentication, and it provides no context to the user on that machine. In our security context, SSL authentication is not sufficient.
The feature has been released in v7.7 as a beta.
I've been running filebeat 7.7 kafka output with the following kerberos configurations:
```yaml
output.kafka:
  hosts: xxxx
  topic: xxxx
  required_acks: 0
  compression: none
  max_message_bytes: 1000000
  kerberos.enabled: true
  kerberos.auth_type: keytab
  kerberos.username: xxx
  kerberos.keytab: xxx
  kerberos.service_name: kafka
  kerberos.config_path: /etc/krb5.conf
  kerberos.realm: xxxx
```
I've verified that kinit works with my principal/keytab, but I'm getting the following error:
2020-05-28T10:54:54.156-0400 DEBUG [kafka] kafka/client.go:290 Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
I'm sure this is not a problem on the Kafka side, because I've used the same principal/keytab to send logs to Kafka via the console Kafka producer, and that was successful. Any ideas what the problem might be, or how I can debug further (debug logs are already enabled)?
There was an issue with the library we use for Kafka. It has been updated in the repo and will hopefully be fixed in v7.7.1. Do you mind testing again when the patch release comes out?
For debugging the issue, I suggest you enable debug logging in Kafka. It will tell you the exact error during authentication.
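On the Filebeat side, you can also scope verbose output to the Kafka client so the handshake error surfaces in Filebeat's own log. A minimal sketch, assuming the `kafka` selector name used by the log lines quoted in this thread:

```yaml
# filebeat.yml fragment: debug logging limited to the kafka output
logging.level: debug
logging.selectors: ["kafka"]
```

On the broker side, the standard JVM flag `-Dsun.security.krb5.debug=true` (e.g. passed via `KAFKA_OPTS`) prints the Kerberos exchange in detail.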
Thanks for getting back to me! Yes, I can test when the patch comes out. Is there a timeline for when that's happening?
It is going to be available in early June.
Thanks, I'll look out for an update on this thread. In the meantime I've been trying to reproduce the error by running a standalone shopify/sarama Kafka producer (which I believe is the library Filebeat uses), and I'm getting the following error:
13:53:06 Error while performing GSSAPI Kerberos Authentication: EOF
Is this the problem you're referring to?
I see you've already merged the fix into Shopify/sarama here: https://github.com/Shopify/sarama/pull/1697. Looking forward to the patch for Filebeat, thanks!
@agarwri The new version has been released. Let me know if you still have problems with Kerberos for Kafka.
I'm getting a new error now:
2020-06-15T12:55:54.282-0400 DEBUG [kafka] kafka/client.go:290 Kafka publish failed with: kafka server: Request was for a topic or partition that does not exist on this broker.
even though the topic definitely exists.
@agarwri Could you please open a separate issue and add the debug logs of Kafka?
Hi, does Filebeat support the GSSAPI mechanism? I keep getting the error "error initializing publisher: kafka: invalid configuration (net.sasl.gssapi.username must not be empty when gssapi mechanism is used)". I already set this property in the config, but it still errors. Any leads? Thanks.
@kvch could you please check the above comment? I am testing on Filebeat 7.7.1. Thanks.
Could you share your config?
Thanks for the response.
Attached is the filebeat.yml we use; we are testing with Filebeat 7.8.
I receive the error below: "Exiting: error initializing publisher: kafka: invalid configuration (Net.SASL.GSSAPI.Username must not be empty when GSS-API mechanism is used)"
Thanks, Mouli
```yaml
###################### Filebeat Configuration Example #########################

filebeat.inputs:
- type: log
  enabled: true
  paths:

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

output.kafka:
  enabled: true
  hosts: ["broker.visa.com:9093"]
  topic: 'filetest'

  # If enabled, events will only be published to partitions with reachable
  # leaders. Default is false.
  #reachable_only: false

  # Configure alternative event field names used to compute the hash value.
  # If empty `output.kafka.key` setting will be used.
  # Default value is empty list.
  #hash: []

  # Pretty-print JSON event
  #pretty: false

  # Configure escaping HTML symbols in strings.
  #escape_html: false

  # Max metadata request retry attempts when cluster is in middle of leader
  # election. Defaults to 3 retries.
  #retry.max: 3

  # Wait time between retries during leader elections. Default is 250ms.
  #retry.backoff: 250ms

  # Refresh metadata interval. Defaults to every 10 minutes.
  #refresh_frequency: 10m

  # Strategy for fetching the topics metadata from the broker. Default is false.
  #full: false

  required_acks: 1

  ssl.enabled: true
  ssl.certificate_authorities: C:\Users\chasrini\Documents\Projects\Kafka\Filebeat\conf\neo4jnonproductionclient.pem
  ssl.certificate: C:\Users\xxxxx\Documents\Projects\Kafka\Filebeat\conf\nonproductionclient.pem
  ssl.key: C:\Users\xxxxx\Documents\Projects\Kafka\Filebeat\conf\nonproductionclient.pem
  ssl.key_passphrase: 'passphrase'

  kerberos.auth_type: keytab
  #kerberos.username: xxxx
  kerberos.keytab: C:\Users\chasrini\Documents\Projects\Kafka\Filebeat\conf\zookeeper_broker.visa.com.keytab
  kerberos.config_path: C:\Users\chasrini\Documents\Projects\Kafka\Filebeat\conf\krb5.conf
  kerberos.service_name: kafka
  kerberos.realm: CORPDEV.VISA.COM

processors:

logging.level: info
```
@moulisea ATM `kerberos.username` is commented out. You need to remove the `#` from the beginning of the line.
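As a sketch (the username, paths, and realm below are placeholders, not taken from the attached config), the resulting block should read:

```yaml
output.kafka:
  kerberos.enabled: true
  kerberos.auth_type: keytab
  kerberos.username: my-principal   # must be set, not commented out
  kerberos.keytab: C:\path\to\client.keytab
  kerberos.config_path: C:\path\to\krb5.conf
  kerberos.service_name: kafka
  kerberos.realm: EXAMPLE.COM
```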
I did that, and now I see the two errors below. The initial logs say the Kafka connection was established, but later:

DEBUG [harvester] log/log.go:107 End of file reached: E:\Logs\file.log; Backoff now.
DEBUG [kafka] kafka/client.go:277 finished kafka batch
DEBUG [kafka] kafka/client.go:291 Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
Kafka publish failed with: circuit breaker is open
Thanks, Mouli
I tried `logging: debug` in the YML file, but it didn't help me find the issue. If you are aware of any other debugging mechanism, let me know. Thanks.
Thanks, Mouli
When I set `output.kafka.version: "2.5.0"`, I get the error below:
ERROR instance/beat.go:958 Exiting: error initializing publisher: unknown/unsupported kafka vesion '2.5.0' accessing 'output.kafka.version' (source:'filebeat.yml')
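One workaround, assuming the error only means the 7.8-era client does not recognize the `2.5.0` version string yet: Kafka brokers are generally wire-compatible with clients speaking an older protocol version, so a version string the client does accept can be used instead. A sketch:

```yaml
output.kafka:
  # An older protocol version the client accepts; a 2.5.0 broker
  # will still talk to it.
  version: '2.0.0'
```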
I also see the alert below for Filebeat:
Known issue in version 7.8.0
The Kafka output fails to connect when using multiple TLS brokers. We advise not to upgrade to Filebeat 7.8.0 if you’re using the Kafka output in this configuration.
Do you recommend using 7.8.0 in production with multiple brokers (we have 3)? I started using 7.8 mainly for SASL_SSL with the GSSAPI mechanism.
Thanks, Mouli
Do you mind opening a Discuss question for these problems? This issue is about tracking Kerberos authentication for Kafka, not arbitrary Kafka issues.
Thanks for the response. I'm not clear on how to open a Discuss question for these problems. Can you provide a link so I can start a discussion?
Yes, the issue is not being able to connect to Kerberized Kafka from Filebeat, so I am exploring Logstash now. But if we could make this work in Filebeat, that would be really great.
Appreciate your help. Thanks.
Thanks, Mouli
I meant opening a question here: https://discuss.elastic.co/c/elastic-stack/beats/28 I will find someone to help you there. Thanks in advance.
Thanks. I have created one. link below.
https://discuss.elastic.co/t/filebeat-connect-with-kafka-kerberos-sasl-ssl-not-working/246160
Thanks, Mouli
Same problem... did you find a solution?
I am having the same issue with filebeat-7.11.1. Any solution for SASL_SSL?

Connection to kafka(xxxxx:9093) established
[kafka] kafka/client.go:371 finished kafka batch
[kafka] kafka/client.go:385 Kafka publish failed with: kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
Same problem. Searching for a link to a solution.
Can you reproduce the issue with 7.12?
We were on 7.10 from the standard RPM repository of Red Hat, I believe. I'll ask to try Filebeat 7.12.
I noticed that it's mostly a Kerberos issue; there is just no log about it. I also noticed we're forced to set the service_name parameter, even though authentication succeeds without it (and with it, it actually fails). I'm trying to understand why on my side.
OK, I confirm that with 7.12 we're able to make it work with the `password` type of Kerberos authentication (with a clear-text password in the configuration). However, as soon as we switch to `auth_type: keytab` and pass the `keytab` option, it stops working. I have confirmed the keytab authentication with `kinit`, using the same service that Filebeat should be trying to use per the config.
We just tested with 7.13: same result. Clear-text password works, but keytab doesn't.
@elafontaine Could you share your filebeat.yml?
Same problem with 7.15. Searching for a link to a solution.
```
2021-10-09T17:10:45.739+0800 INFO [file_watcher] filestream/fswatch.go:137 Start next scan
2021-10-09T17:10:50.838+0800 ERROR [kafka] kafka/client.go:317 Kafka (topic=common_log_topic): kafka: client has run out of available brokers to talk to (Is your cluster reachable?)
```
@JunTaoYuan80 Have you solved it now? I have the same problem.
No, but I switched from Filebeat to Logstash, and that works.
Pinging @elastic/elastic-agent-data-plane (Team:Elastic-Agent-Data-Plane)
@rdner @faec does this issue ring a bell on your end?
Our latest documentation claims Kerberos is supported:
To use GSSAPI mechanism to authenticate with Kerberos, you must leave this field empty, and use the kerberos options.
https://www.elastic.co/guide/en/beats/filebeat/current/kafka-output.html#_sasl_mechanism
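Per that documentation, a minimal sketch of a GSSAPI/Kerberos Kafka output would be (the hosts, topic, principal, paths, and realm below are placeholders):

```yaml
output.kafka:
  hosts: ["broker.example.com:9093"]
  topic: "logs"
  # Leave sasl.mechanism unset; GSSAPI is selected via the kerberos.* options.
  kerberos.enabled: true
  kerberos.auth_type: keytab
  kerberos.username: filebeat
  kerberos.keytab: /etc/filebeat/filebeat.keytab
  kerberos.config_path: /etc/krb5.conf
  kerberos.service_name: kafka
  kerberos.realm: EXAMPLE.COM
```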
However, we still don't have an integration test to verify that it works: https://github.com/elastic/beats/issues/29430
Hi guys, I saw that the Beats library doesn't support Kerberos authentication for the Kafka output, but the Logstash Kafka input does. Any plan to add this kind of auth? Thanks.