elodina / scala-kafka

Quick up and running using Scala for Apache Kafka
Apache License 2.0

Reconnect to the server failed. #32

Open wpoosanguansit opened 9 years ago

wpoosanguansit commented 9 years ago

Hi, I am trying to re-init KafkaProducer in a try/catch block. However, I get this error:

A38 OK SEARCH completed (Success)
2015-05-22 14:54:53 INFO ClientUtils$:68 - Fetching metadata from broker id:0,host:localhost,port:9092 with correlation id 30 for 1 topic(s) Set(test)
2015-05-22 14:54:53 INFO SyncProducer:68 - Connected to localhost:9092 for producing
2015-05-22 14:54:53 INFO SyncProducer:68 - Disconnecting from localhost:9092
2015-05-22 14:54:53 WARN ClientUtils$:89 - Fetching topic metadata with correlation id 30 for topics [Set(test)] from broker [id:0,host:localhost,port:9092] failed
java.nio.channels.ClosedChannelException
    at kafka.network.BlockingChannel.send(BlockingChannel.scala:100)
    at kafka.producer.SyncProducer.liftedTree1$1(SyncProducer.scala:73)
    at kafka.producer.SyncProducer.kafka$producer$SyncProducer$$doSend(SyncProducer.scala:72)
    at kafka.producer.SyncProducer.send(SyncProducer.scala:113)
    at kafka.client.ClientUtils$.fetchTopicMetadata(ClientUtils.scala:58)
    at kafka.producer.BrokerPartitionInfo.updateInfo(BrokerPartitionInfo.scala:82)
    at kafka.producer.BrokerPartitionInfo.getBrokerPartitionInfo(BrokerPartitionInfo.scala:49)
    at kafka.producer.async.DefaultEventHandler.kafka$producer$async$DefaultEventHandler$$getPartitionListForTopic(DefaultEventHandler.scala:186)
    at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:150)
    at kafka.producer.async.DefaultEventHandler$$anonfun$partitionAndCollate$1.apply(DefaultEventHandler.scala:149)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
    at kafka.producer.async.DefaultEventHandler.partitionAndCollate(DefaultEventHandler.scala:149)
    at kafka.producer.async.DefaultEventHandler.dispatchSerializedData(DefaultEventHandler.scala:95)
    at kafka.producer.async.DefaultEventHandler.handle(DefaultEventHandler.scala:72)
    at kafka.producer.Producer.send(Producer.scala:77)
    at kafka.producer.KafkaProducer.send(KafkaProducer.scala:108)
    at kafka.producer.KafkaProducer.send(KafkaProducer.scala:104)

Is there a way to reconnect to the queue after the re-init? Thanks for your help.
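
For context, this is roughly the pattern I am trying to get working -- a minimal sketch, assuming kafka.producer.KafkaProducer from this repo exposes send and close. The constructor arguments (topic and broker list), retry count, and backoff values below are placeholders, not the library's documented API:

```scala
import kafka.producer.KafkaProducer

// Sketch only: rebuild the producer and retry the send when the channel
// has been closed. Constructor arguments and retry settings are assumptions.
object ReconnectingSend {
  private def newProducer(): KafkaProducer =
    new KafkaProducer("test", "localhost:9092") // hypothetical topic / broker list

  def sendWithRetry(message: String, maxRetries: Int = 3): Unit = {
    var producer = newProducer()
    var attempt  = 0
    var sent     = false
    try {
      while (!sent && attempt <= maxRetries) {
        try {
          producer.send(message)
          sent = true
        } catch {
          case _: Exception =>
            attempt += 1
            // The old connection is unusable once the channel is closed,
            // so drop it and build a fresh producer before retrying.
            try producer.close() catch { case _: Exception => () }
            Thread.sleep(1000L * attempt) // simple linear backoff
            producer = newProducer()
        }
      }
    } finally {
      producer.close()
    }
    if (!sent) sys.error(s"send failed after $maxRetries retries")
  }
}
```

From what I can tell, a ClosedChannelException on the metadata fetch often points at the broker side (broker down, or an advertised.host.name/port the client cannot resolve), in which case rebuilding the producer alone does not help.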

sallum commented 9 years ago

I am experiencing the same error now... did you manage to find a solution to it?

wpoosanguansit commented 9 years ago

Unfortunately not. I ended up not using the library.

tantonyan commented 9 years ago

I am seeing the same error when producing to any of my topics:

Fetching topic metadata with correlation id 0 for topics [Set(my-topic)] from broker [id:0,host:localhost,port:9092] failed (kafka.client.ClientUtils$)

Producing used to work fine, and I had a stream-reading script (in Python) running that was adding messages last night -- this morning the script was no longer running and I cannot manually add any messages either.

I can still consume messages. If anyone has found a solution or knows the cause of the issue, please post.
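
To narrow it down, a direct metadata request against the broker (outside this library) can help -- a rough sketch using the plain Kafka 0.8.x Scala API; the host, port, timeout, and topic name below are placeholders:

```scala
import kafka.api.TopicMetadataRequest
import kafka.consumer.SimpleConsumer

// Sketch only: ask the broker directly for topic metadata, bypassing the
// producer path entirely. Connection settings and topic are placeholders.
object MetadataCheck extends App {
  val consumer = new SimpleConsumer("localhost", 9092, 10000, 64 * 1024, "metadata-check")
  try {
    val response = consumer.send(new TopicMetadataRequest(Seq("my-topic"), 0))
    response.topicsMetadata.foreach { tm =>
      println(s"topic=${tm.topic} errorCode=${tm.errorCode} partitions=${tm.partitionsMetadata.size}")
    }
  } finally {
    consumer.close()
  }
}
```

If this also fails with a ClosedChannelException, the problem is likely between the client and the broker (broker down, wrong port, or an advertised hostname the client cannot resolve) rather than in the producer code.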

matanox commented 8 years ago

Does this issue still persist?

badrishdavey commented 7 years ago

Did you find a solution for this? I am seeing the same issue when using the Kerberos-based SSL setup and passing in the information.

wpoosanguansit commented 7 years ago

Unfortunately not. We ended up not using the lib.
