confluentinc / librdkafka

The Apache Kafka C/C++ library

Local: Host resolution failure: (errno: Bad address) #2687

Closed Akshaynada closed 4 years ago

Akshaynada commented 4 years ago

Read the FAQ first: https://github.com/edenhill/librdkafka/wiki/FAQ

Description

Hi, I am trying to create a producer that connects to a Kafka broker. This is an SSL connection with all the certs supplied.

```
RDKAFKA-7-BROKERFAIL: rdkafka#producer-3: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: failed: err: Local: Host resolution failure: (errno: Bad address)
nfm_trace_dump.txt:[2020 Jan 15 18:54:32.295180265:7096:P:logger:1486] RDKAFKA-3-FAIL: rdkafka#producer-3: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: Failed to resolve '10.0.0.1:9092': invalid flags value
```

How to reproduce

  1. Created a conf and updated the parameters:

```c
/* Set up the dcos_context */
snprintf(vrf_str, sizeof(vrf_str), "%d", 4);
if (setenv("DCOS_CONTEXT", vrf_str, 1) == -1)
    /* handle error */

rd_kafka_conf_set(conf, "bootstrap.servers", "10.0.0.1", errstr, sizeof(errstr));
rd_kafka_conf_set(conf, "security.protocol", "SSL", errstr, sizeof(errstr));
rd_kafka_conf_set(conf, "ssl.certificate.location", "/client/kafka/KafkaClient.crt", errstr, sizeof(errstr));
rd_kafka_conf_set(conf, "ssl.key.location", "/client/kafka/KafkaClient.key", errstr, sizeof(errstr));
rd_kafka_conf_set(conf, "ssl.ca.location", "/client/kafka/Ca.crt", errstr, sizeof(errstr));
```

  2. rd_conf_new() with the above configs was successful.
  3. Created a topic using rd_kafka_topic_new(), which was successful:

```c
rkt = rd_kafka_topic_new(rk, "topic_test", NULL);
```

  4. Started rd_kafka_poll() after the creation of the producer:

```c
rd_kafka_poll(rk, 1000 /* block for max 1000 ms */);
```

  5. Got the following logs:

```
RDKAFKA-7-SSL: rdkafka#producer-1: [thrd:app]: Loading CA certificate(s) from file /client/kafka/Ca.crt
RDKAFKA-7-SSL: rdkafka#producer-1: [thrd:app]: Loading certificate from file /client/kafka/KafkaClient.crt
RDKAFKA-7-SSL: rdkafka#producer-1: [thrd:app]: Loading private key file from /client/kafka/KafkaClient.key
RDKAFKA-7-BRKMAIN: rdkafka#producer-1: [thrd::0/internal]: :0/internal: Enter main broker thread
RDKAFKA-7-STATE: rdkafka#producer-1: [thrd::0/internal]: :0/internal: Broker changed state INIT -> UP
RDKAFKA-7-WAKEUPFD: rdkafka#producer-1: [thrd:app]: ssl://10.0.0.1:9092/bootstrap: Enabled low-latency partition queue wake-ups
RDKAFKA-7-BROADCAST: rdkafka#producer-1: [thrd::0/internal]: Broadcasting state change
RDKAFKA-7-WAKEUPFD: rdkafka#producer-1: [thrd:app]: ssl://10.0.0.1:9092/bootstrap: Enabled low-latency ops queue wake-ups
RDKAFKA-7-BROKER: rdkafka#producer-1: [thrd:app]: ssl://10.0.0.1:9092/bootstrap: Added new broker with NodeId -1
[KAFKA] Created kafka producer
RDKAFKA-7-TOPIC: rdkafka#producer-1: [thrd:app]: New local topic: topic_test
RDKAFKA-7-BRKMAIN: rdkafka#producer-1: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: Enter main broker thread
RDKAFKA-7-CONNECT: rdkafka#producer-1: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: broker in state INIT connecting
RDKAFKA-7-TOPPARNEW: rdkafka#producer-1: [thrd:app]: NEW topic_test [-1] 0x1183cdbc (at rd_kafka_topic_new0:282)
RDKAFKA-7-METADATA: rdkafka#producer-1: [thrd:app]: Skipping metadata refresh of 1 topic(s): no usable brokers
Created kafka topic
Created producer
RDKAFKA-7-BROKERFAIL: rdkafka#producer-1: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: failed: err: Local: Host resolution failure: (errno: Bad address)
RDKAFKA-3-FAIL: rdkafka#producer-1: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: Failed to resolve '10.0.0.1:9092': invalid flags value
RDKAFKA-3-ERROR: rdkafka#producer-1: [thrd:ssl://10.0.0.1:9092/bootstrap]: ssl://10.0.0.1:9092/bootstrap: Failed to resolve '10.0.0.1:9092': invalid flags value
```

  6. The certs are all present and the broker is accessible via openssl:

```
DCOS_CONTEXT=4 openssl s_client -CAfile /client/kafka/Ca.crt -connect 10.0.0.1:9092 -prexit -cert /client/kafka/KafkaClient.crt -key /client/kafka/KafkaClient8.key -debug -state -tls1 -msg
```

SSL handshake has read 2493 bytes and written 2487 bytes

IMPORTANT: Always try to reproduce the issue on the latest released version (see https://github.com/edenhill/librdkafka/releases), if it can't be reproduced on the latest version the issue has been fixed.

Checklist

IMPORTANT: We will close issues where the checklist has not been completed.

Please provide the following information:

edenhill commented 4 years ago

Seems like a portability problem with getaddrinfo() and NXOS. I don't have access to an NXOS platform, so I can't really troubleshoot this further, but I suggest you read the manual page for getaddrinfo() on NXOS and compare it to the code in rdaddr.c to see if there are any obvious incompatibilities or unsupported flags.
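One way to check this without involving librdkafka at all: the "invalid flags value" text is most likely the platform's gai_strerror() string for EAI_BADFLAGS, so a tiny program that calls getaddrinfo() directly with different ai_flags values will show which flags the local resolver rejects. This is a minimal sketch, assuming a POSIX getaddrinfo(); AI_ADDRCONFIG is tried only because it is a common culprit on partial resolver implementations, not because it is necessarily the flag rdaddr.c passes.

```c
#include <netdb.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/types.h>

/* Resolve a broker address with an explicit ai_flags value.
 * Returns the getaddrinfo() error code (0 = success). */
static int probe_flags(const char *host, const char *port, int flags) {
    struct addrinfo hints, *res = NULL;
    memset(&hints, 0, sizeof(hints));
    hints.ai_family   = AF_UNSPEC;
    hints.ai_socktype = SOCK_STREAM;
    hints.ai_flags    = flags;
    int r = getaddrinfo(host, port, &hints, &res);
    if (res)
        freeaddrinfo(res);
    return r;
}

int main(void) {
    int r0 = probe_flags("127.0.0.1", "9092", 0);
    int r1 = probe_flags("127.0.0.1", "9092", AI_ADDRCONFIG);

    printf("no flags:      %s\n", r0 == 0 ? "ok" : gai_strerror(r0));
    printf("AI_ADDRCONFIG: %s\n", r1 == 0 ? "ok" : gai_strerror(r1));
    return 0;
}
```

If the no-flags call succeeds while a flagged call fails with the same "invalid flags value" string seen in the librdkafka logs, that flag is the incompatibility to look for in rdaddr.c.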

Also note that librdkafka 0.9.1 is almost 4 years old and completely unsupported; you should upgrade to the latest version of librdkafka.