hth4 opened this issue 3 years ago
What is your CONFLUENT_HOME environment variable (echo $CONFLUENT_HOME)? In what directory (full path) did you install Confluent? Is it a directory only accessible to the root user (and does kafka-avro-console-producer have those permissions)?
I installed my test installation as root:
CONFLUENT_HOME=/root/confluent-6.1.1 PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin://root/confluent-6.1.1/bin:/root/confluent-hub/bin://root/confluent-cli
-rwxr-xr-x. 1 root root 1545 Mar 17 01:20 /root/confluent-6.1.1/bin/kafka-avro-console-producer
I think the problem is in your PATH, specifically:
//root/confluent-6.1.1/bin:/root/confluent-hub/bin://root/confluent-cli
The // should be /. Because of that, the schema-registry-run-class script run by kafka-avro-console-producer builds an incorrect path:
https://github.com/confluentinc/schema-registry/blob/master/bin/schema-registry-run-class#L68
It tries to load file://root/confluent-6.1.1/bin/../etc/schema-registry/log4j.properties (an invalid URI) instead of file:/root/confluent-6.1.1/bin/../etc/schema-registry/log4j.properties (a valid URI). If you fix your PATH, it should start correctly (assuming kafka-avro-console-producer is started as root).
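A minimal sketch of the fix, using the paths from the PATH shown above (adjust to your installation): remove the stray leading slashes so the producer script resolves from /root/... and the log4j config location becomes a valid file:/ URI instead of file://....

# PATH from this thread, with the duplicated "//" before the Confluent entries
# replaced by a single "/":
export CONFLUENT_HOME=/root/confluent-6.1.1
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin:$CONFLUENT_HOME/bin:/root/confluent-hub/bin:/root/confluent-cli

# Sanity check: the resolved script location should now start with /root, not //root
which kafka-avro-console-producer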
You are right! After fixing the PATH, that error went away. Thanks! Now I get:
[2021-05-10 17:37:12,527] ERROR Could not parse Avro schema (io.confluent.kafka.schemaregistry.avro.AvroSchemaProvider:57)
org.apache.avro.SchemaParseException: com.fasterxml.jackson.core.JsonParseException: Unexpected character ('n' (code 110)): was expecting double-quote to start field name
at [Source: (String)"{"type":"record",name":"key_schema","fields":[{"name":"id","type":"int"}]}"; line: 1, column: 19]
at org.apache.avro.Schema$Parser.parse(Schema.java:1396)
at org.apache.avro.Schema$Parser.parse(Schema.java:1382)
at io.confluent.kafka.schemaregistry.avro.AvroSchema.
Did I make another stupid mistake?
Found the typo myself: '{"type":"record",name":...' should be '{"type":"record","name":...'. Many thanks for the fast response!
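For reference, the corrected key.schema property (only the missing quote before "name" added, otherwise unchanged from the command quoted below) would be:

--property key.schema='{"type":"record","name":"key_schema","fields":[{"name":"id","type":"int"}]}'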
Can we close the issue now, @hth4?
While following the installation instructions, I ran into the following error with kafka-avro-console-producer:
kafka-avro-console-producer --broker-list localhost:9092 --topic example --property parse.key=true --property key.schema='{"type":"record",name":"key_schema","fields":[{"name":"id","type":"int"}]}' --property "key.separator=$" --property value.schema='{"type":"record","name":"value_schema","fields":[{"name":"id","type":"int"},{"name":"firstName","type":"string"},{"name":"lastName","type":"string"}]}'
log4j:ERROR Could not read configuration file from URL [file://root/confluent-6.1.1/bin/../etc/schema-registry/log4j.properties].
java.net.UnknownHostException: root
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:184)
at java.net.Socket.connect(Socket.java:607)
at sun.net.ftp.impl.FtpClient.doConnect(FtpClient.java:962)
at sun.net.ftp.impl.FtpClient.tryConnect(FtpClient.java:924)
at sun.net.ftp.impl.FtpClient.connect(FtpClient.java:1019)
at sun.net.ftp.impl.FtpClient.connect(FtpClient.java:1005)
at sun.net.www.protocol.ftp.FtpURLConnection.connect(FtpURLConnection.java:311)
at sun.net.www.protocol.ftp.FtpURLConnection.getInputStream(FtpURLConnection.java:417)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:557)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.(LogManager.java:127)
at org.slf4j.impl.Log4jLoggerFactory.(Log4jLoggerFactory.java:66)
at org.slf4j.impl.StaticLoggerBinder.(StaticLoggerBinder.java:72)
at org.slf4j.impl.StaticLoggerBinder.(StaticLoggerBinder.java:45)
at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:417)
at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:362)
at com.typesafe.scalalogging.Logger$.apply(Logger.scala:48)
at kafka.utils.Log4jControllerRegistration$.(Logging.scala:25)
at kafka.utils.CommandLineUtils$.(CommandLineUtils.scala:28)
at kafka.tools.ConsoleProducer$ProducerConfig.(ConsoleProducer.scala:228)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:41)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
log4j:ERROR Ignoring configuration file [file://root/confluent-6.1.1/bin/../etc/schema-registry/log4j.properties].
log4j:WARN No appenders could be found for logger (kafka.utils.Log4jControllerRegistration$).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
java.util.NoSuchElementException: No value present
at java.util.Optional.get(Optional.java:135)
at io.confluent.kafka.formatter.SchemaMessageReader.parseSchema(SchemaMessageReader.java:212)
at io.confluent.kafka.formatter.SchemaMessageReader.getSchema(SchemaMessageReader.java:224)
at io.confluent.kafka.formatter.SchemaMessageReader.init(SchemaMessageReader.java:158)
at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:43)
at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
Is there more to install than copying the connector jar files?
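For reference, here is the same command with the missing quote added before "name" in key.schema (and assuming the PATH has been corrected as discussed above); with the typo fixed, the schema parses:

kafka-avro-console-producer --broker-list localhost:9092 --topic example \
  --property parse.key=true \
  --property key.schema='{"type":"record","name":"key_schema","fields":[{"name":"id","type":"int"}]}' \
  --property "key.separator=$" \
  --property value.schema='{"type":"record","name":"value_schema","fields":[{"name":"id","type":"int"},{"name":"firstName","type":"string"},{"name":"lastName","type":"string"}]}'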