Open · cdmikechen opened this issue 2 years ago
Maybe you can try using the Confluent Avro converter, like in this example?
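(For context, a minimal sketch of what that could look like in the connector/worker properties; the Schema Registry URL is a placeholder, and these are the standard Kafka Connect / Confluent converter properties rather than anything taken from the linked example.)

```
# Serialize values with the Confluent Avro converter, which registers and
# looks up schemas in a Schema Registry.
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
# Keys can stay as plain strings (or use the Avro converter as well).
key.converter=org.apache.kafka.connect.storage.StringConverter
```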
We have a similar question related to Avro events consumed by the StreamReactor MQTT source (latest version):

- Without the AvroConverter, a new bytes-only schema is created in Schema Registry (no field inference).
- With the AvroConverter, it is not possible to pass a Schema Registry URL through avro.schemas; indeed $path must be a valid java.io.File (e.g. /path/to/schema.avsc), otherwise a ConfigException is thrown by this code.

Would it be reasonable and valuable to allow Schema Registry URLs as well? Since the PR will affect all connectors (kafka-connect-common), is there any performance or security concern?
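To make the request concrete, a rough sketch of the difference (the key/value format of the avro.schemas setting below is from memory and may not match the real property exactly, and the URL form is the hypothetical extension being asked about, not something the connector supports today):

```
# Current behaviour: the value must resolve to a local Avro schema file
# on the Connect worker, otherwise a ConfigException is thrown.
connect.converter.avro.schemas=/mqtt/source/topic=/path/to/schema.avsc

# Hypothetical extension: also accept a Schema Registry location, so the
# converter can fetch (or register) the schema instead of reading a local file.
connect.converter.avro.schemas=/mqtt/source/topic=http://schema-registry:8081/subjects/my-subject/versions/latest
```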
Issue Guidelines
Please review these questions before submitting any issue.
What version of the Stream Reactor are you reporting this issue for?
3.0.1
Are you running the correct version of Kafka/Confluent for the Stream reactor release?
Yes
Do you have a supported version of the data source/sink, i.e. Cassandra 3.0.9?
Have you read the docs?
https://docs.lenses.io/5.0/integrations/connectors/stream-reactor/sources/mqttsourceconnector/
What is the expected behaviour?
I use JSON by default for Kafka data storage. I've found that this project supports the AvroConverter, and that connect.converter.avro.schemas can be used to specify an Avro schema file. But when pushing data to Kafka as a source connector, the connector should also support a schema registry mode and register the Avro schema with the Schema Registry service.
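For illustration, this is roughly the kind of source connector configuration I mean (the property values are placeholders and the schema-mapping format is my best reading of the docs, so treat it as a sketch rather than a verified configuration):

```
name=mqtt-avro-source
connector.class=com.datamountaineer.streamreactor.connect.mqtt.source.MqttSourceConnector
connect.mqtt.hosts=tcp://mqtt-broker:1883
# Decode MQTT payloads with the Stream Reactor AvroConverter and write them to a Kafka topic.
connect.mqtt.kcql=INSERT INTO kafka_topic SELECT * FROM /mqtt/topic WITHCONVERTER=`com.datamountaineer.streamreactor.connect.converters.source.AvroConverter`
# Today the schema has to be a local .avsc file on the worker.
connect.converter.avro.schemas=/mqtt/topic=/path/to/schema.avsc
# What I am asking for: a schema-registry mode, where the connector registers this
# schema with (or reads it from) a Schema Registry service instead of a local file.
```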
What was observed?
Except for some descriptions in the Kudu connector, I didn't find how to configure this anywhere else.
What is your Connect cluster configuration (connect-avro-distributed.properties)?
What is your connector properties configuration (my-connector.properties)?
Please provide full log files (redact any sensitive information)
No