Open ram-venket36 opened 2 months ago
@ram-venket36 I notice the Apicurio schema registry is documented as compatible with the Confluent Schema Registry here: https://www.apicur.io/registry/docs/apicurio-registry/2.6.x/getting-started/assembly-intro-to-the-registry.html. So I assume this connector should also work with Apicurio.
If you are facing any issues, please post the error here and we can take a look. Also include the full stack trace for the unknown magic byte error.
@muralibasani
Instead of the Confluent AvroConverter jar, I updated the connector package with the Apicurio Avro converter jar (apicurio-registry-utils-converter) and referenced that converter class in the S3 sink config, which worked.
To my understanding, the Confluent AvroConverter expects the Confluent wire format, where the first byte of the value is the magic byte 0x0 followed by a 4-byte schema ID. Since the events are serialized by the Apicurio serializer on the producer side, which by default writes its schema identification in a different layout, the first byte does not match and the converter fails with the unknown magic byte error.
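A minimal sketch of what that value-converter swap can look like in the sink config (the registry URL is a placeholder, and the apicurio.registry.* property names are assumptions based on the Apicurio Registry 2.x converter docs, not the exact config used here):

```
# Apicurio Avro converter shipped in apicurio-registry-utils-converter
value.converter: io.apicurio.registry.utils.converter.AvroConverter
# Converter-level properties are passed with the value.converter. prefix;
# the URL below is a placeholder for your Apicurio registry endpoint
value.converter.apicurio.registry.url: https://my-apicurio-registry/apis/registry/v2
key.converter: org.apache.kafka.connect.storage.StringConverter
```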
@ram-venket36 as long as you have the same Apicurio SerDes on the producer and consumer side, that should work fine. So I assume that's not the case here, which is why you are getting this issue, and this is expected behaviour I believe.
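For comparison, a minimal sketch of matching producer-side properties with the Apicurio Avro serializer (class and property names assume Apicurio Registry 2.x and are illustrative, not taken from the setup described below):

```
# Producer side: Apicurio Avro SerDes (sketch; adjust to your Apicurio version)
key.serializer: org.apache.kafka.common.serialization.StringSerializer
value.serializer: io.apicurio.registry.serde.avro.AvroKafkaSerializer
# Placeholder registry endpoint
apicurio.registry.url: https://my-apicurio-registry/apis/registry/v2
# Register the schema automatically on first use (optional)
apicurio.registry.auto-register: true
```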
Hi
I have the following setup:

- Event Streams (Apache Kafka platform)
- Apicurio schema registry
- Java Spring custom app that produces Avro-serialized events using the Apicurio SerDes library
- Aiven S3 sink connector configured on that topic
With this setup I am getting a SerializationException: Unknown magic byte error when the connector tries to store the events in S3. With a custom Java consumer app I am able to deserialize those events successfully (using the Apicurio SerDes library); I only get this issue via the connector.
Does this connector work with the Apicurio schema registry, or only with the Confluent Schema Registry? Also, can events serialized by the Apicurio SerDes library be converted by the Confluent AvroConverter?
Here is my connector config:

```
value.converter.schema.registry.url: 'My apicurio schema registry url'
value.converter: io.confluent.connect.avro.AvroConverter
value.converter.use.latest.version: true
value.converter.schema.enable: true
value.converter.specific.avro.reader: true
value.converter.enhanced.avro.schema.support: true
topic: topic name
key.converter: ....Storage.StringConverter
connector.class: ...... AivenKafkaConnectS3SinkConnector
```
Apart from this, I have provided the default S3 bucket, access details, and broker URL.
Thanks