AliwareMQ / aliware-kafka-demos

Demo projects for accessing Alibaba Cloud Message Queue for Apache Kafka from various clients
https://www.aliyun.com/product/kafka

Java demo keeps failing with org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 30000 ms. #20

Open huiz6 opened 6 years ago

huiz6 commented 6 years ago

The full error is as follows:

```
java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 30000 ms.
	at org.apache.kafka.clients.producer.KafkaProducer$FutureFailure.<init>(KafkaProducer.java:730)
	at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:483)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:430)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:353)
	at com.aliyun.openservices.kafka.ons.KafkaProducerDemo.main(KafkaProducerDemo.java:53)
Caused by: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 30000 ms.
```

The client version is the recommended one:

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.10.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```
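A common cause of this metadata timeout is that the producer never reaches the bootstrap endpoint at all (wrong address, blocked port, or no public-network access from the client host). As a first diagnostic, a plain TCP connectivity check can rule that out. This is only a sketch: the host and port below are the ones that appear in the logs in this thread, and you should substitute your own instance's endpoint.

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class EndpointCheck {
    // Returns true if a TCP connection to host:port succeeds within timeoutMs.
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Endpoint taken from the logs in this thread; replace with your instance's address.
        System.out.println(reachable("kafka-cn-internet.aliyun.com", 8080, 3000));
    }
}
```

If this prints `false`, no producer setting will help; the network path itself is the problem.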
felayman commented 6 years ago

I keep getting this error too.

felayman commented 6 years ago

The log output is:

```
[2018-06-26 20:50:17,098] INFO ProducerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 bootstrap.servers = [kafka-cn-internet.aliyun.com:8080] ssl.keystore.type = JKS sasl.mechanism = ONS max.block.ms = 3000 interceptor.classes = null ssl.truststore.password = [hidden] client.id = ssl.endpoint.identification.algorithm = null request.timeout.ms = 30000 acks = 1 receive.buffer.bytes = 32768 ssl.truststore.type = JKS retries = 0 ssl.truststore.location = E:\config\kafka.client.truststore.jks ssl.keystore.password = null send.buffer.bytes = 131072 compression.type = none metadata.fetch.timeout.ms = 60000 retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit buffer.memory = 33554432 timeout.ms = 30000 key.serializer = class org.apache.kafka.common.serialization.StringSerializer sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX block.on.buffer.full = false ssl.key.password = null sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 max.in.flight.requests.per.connection = 5 metrics.num.samples = 2 ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] batch.size = 16384 ssl.keystore.location = null ssl.cipher.suites = null security.protocol = SASL_SSL max.request.size = 1048576 value.serializer = class org.apache.kafka.common.serialization.StringSerializer ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner linger.ms = 0 (org.apache.kafka.clients.producer.ProducerConfig)
[2018-06-26 20:50:17,688] INFO Successfully logged in. (org.apache.kafka.common.security.authenticator.AbstractLogin)
[2018-06-26 20:50:18,645] INFO ProducerConfig values: metric.reporters = [] metadata.max.age.ms = 300000 reconnect.backoff.ms = 50 sasl.kerberos.ticket.renew.window.factor = 0.8 bootstrap.servers = [kafka-cn-internet.aliyun.com:8080] ssl.keystore.type = JKS sasl.mechanism = ONS max.block.ms = 3000 interceptor.classes = null ssl.truststore.password = [hidden] client.id = producer-1 ssl.endpoint.identification.algorithm = null request.timeout.ms = 30000 acks = 1 receive.buffer.bytes = 32768 ssl.truststore.type = JKS retries = 0 ssl.truststore.location = E:\config\kafka.client.truststore.jks ssl.keystore.password = null send.buffer.bytes = 131072 compression.type = none metadata.fetch.timeout.ms = 60000 retry.backoff.ms = 100 sasl.kerberos.kinit.cmd = /usr/bin/kinit buffer.memory = 33554432 timeout.ms = 30000 key.serializer = class org.apache.kafka.common.serialization.StringSerializer sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 ssl.trustmanager.algorithm = PKIX block.on.buffer.full = false ssl.key.password = null sasl.kerberos.min.time.before.relogin = 60000 connections.max.idle.ms = 540000 max.in.flight.requests.per.connection = 5 metrics.num.samples = 2 ssl.protocol = TLS ssl.provider = null ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1] batch.size = 16384 ssl.keystore.location = null ssl.cipher.suites = null security.protocol = SASL_SSL max.request.size = 1048576 value.serializer = class org.apache.kafka.common.serialization.StringSerializer ssl.keymanager.algorithm = SunX509 metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner linger.ms = 0 (org.apache.kafka.clients.producer.ProducerConfig)
[2018-06-26 20:50:18,648] INFO Kafka version : 0.10.0.0 (org.apache.kafka.common.utils.AppInfoParser)
[2018-06-26 20:50:18,648] INFO Kafka commitId : b8642491e78c5a13 (org.apache.kafka.common.utils.AppInfoParser)
```

Then the error occurred:

```
java.util.concurrent.ExecutionException: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 3000 ms.
	at org.apache.kafka.clients.producer.KafkaProducer$FutureFailure.<init>(KafkaProducer.java:730)
	at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:483)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:430)
	at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:353)
	at kafka.KafkaProducerDemo.main(KafkaProducerDemo.java:57)
Caused by: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 3000 ms.
```
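Note that this second report says "after 3000 ms", which matches `max.block.ms = 3000` in the config dump: that setting caps how long `KafkaProducer.send()` may block while fetching metadata, so lowering it makes the timeout fire sooner. A minimal stdlib-only sketch of raising it (the 60000 value is an assumption, not an official recommendation, and a longer timeout only helps if the endpoint is actually reachable):

```java
import java.util.Properties;

public class TimeoutTuning {
    // Raises max.block.ms, the cap on how long send() may block on metadata.
    public static Properties tune(Properties props) {
        props.put("max.block.ms", "60000"); // a more forgiving value; adjust to taste
        return props;
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("max.block.ms", "3000"); // the value shown in the log above
        tune(props);
        System.out.println(props.getProperty("max.block.ms")); // prints 60000
        // These Properties would then be passed to new KafkaProducer<>(props),
        // which requires the kafka-clients dependency and a reachable endpoint.
    }
}
```

If the exception still appears with the larger value, the endpoint, port, or SASL credentials are the more likely culprit than the timeout itself.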