Open SamiShaikh opened 1 year ago
This is a known documented limitation - https://github.com/confluentinc/ksql/pull/10036/files
The docs seem unrelated. The documented limitation is with JSON Schema; this ticket is for Avro.
Do we have any updates on this topic? KSQL is unable to convert the stringified enum field back to the Avro-serialized enum index.
When using a CSAS whose source and target share the same Avro schema (via schema inference) containing enums, the original Avro enum is converted to a string inside ksqlDB. ksqlDB is able to read the Avro enums, but is not able to write them back to a topic.
Original enum (part of a namespace):

```json
{
  "name": "eventName",
  "type": {
    "type": "enum",
    "name": "EventName",
    "doc": "Event name",
    "symbols": ["MY_DEFAULT_VALUE"],
    "default": "MY_DEFAULT_VALUE"
  },
  "doc": "Event name"
},
```
Resulting enum in ksqlDB:

```json
{
  "name": "eventName",
  "type": {
    "type": "string",
    "connect.parameters": {
      "io.confluent.connect.avro.enum.doc.EventName": "Event name",
      "io.confluent.connect.avro.enum.default.EventName": "MY_DEFAULT_VALUE",
      "io.confluent.connect.avro.Enum": "my.namespace.EventName",
      "io.confluent.connect.avro.Enum.MY_DEFAULT_VALUE": "MY_DEFAULT_VALUE"
    },
    "connect.name": "my.namespace.EventName"
  },
  "doc": "Event name"
},
```
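To make the mismatch concrete, here is a small stand-alone sketch (Python, with the two field schemas reconstructed from the fragments in this report) showing that ksqlDB re-declares the field as a plain `string` carrying Connect metadata, rather than an Avro `enum`; the differing top-level Avro types are what the serializer's lookup-by-id compatibility check rejects:

```python
import json

# Field-level schema fragments reconstructed from this report
# (not full record schemas).
original_field = json.loads("""
{
  "name": "eventName",
  "type": {
    "type": "enum",
    "name": "EventName",
    "symbols": ["MY_DEFAULT_VALUE"],
    "default": "MY_DEFAULT_VALUE"
  }
}
""")

ksqldb_field = json.loads("""
{
  "name": "eventName",
  "type": {
    "type": "string",
    "connect.name": "my.namespace.EventName",
    "connect.parameters": {
      "io.confluent.connect.avro.Enum": "my.namespace.EventName"
    }
  }
}
""")

def avro_type(field):
    """Return the Avro type name of a field, unwrapping complex types."""
    t = field["type"]
    return t["type"] if isinstance(t, dict) else t

print(avro_type(original_field))  # enum
print(avro_type(ksqldb_field))    # string
```

The Connect parameters preserve the enum's name, doc, and default as annotations, but the Avro type itself is still `string`, so the topic's registered `enum` schema and ksqlDB's output schema are not the same type.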
Error:

```
Caused by: java.io.IOException: Incompatible schema {....} with refs [] of type AVRO for schema {...} Set id.compatibility.strict=false to disable this check
    at io.confluent.kafka.serializers.AbstractKafkaSchemaSerDe.lookupSchemaBySubjectAndId(AbstractKafkaSchemaSerDe.java:544)
    at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:130)
```
Describe the bug
Schema inference with ID in a CSAS does not work for enums.
To Reproduce
Register the schema below for the topic avroTest4.
Produce a record, then create the streams below.
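The original stream definitions are not included in this report; the following is a minimal reconstruction of what they likely looked like, where the stream names, the source topic mapping, and the use of the `VALUE_SCHEMA_ID` property (ksqlDB's schema-inference-with-ID mechanism) are assumptions:

```sql
-- Source stream over the topic whose Avro schema contains the enum.
CREATE STREAM avroTest4_stream WITH (
  KAFKA_TOPIC = 'avroTest4',
  VALUE_FORMAT = 'AVRO'
);

-- CSAS targeting the same registered schema by ID;
-- serialization of the enum field fails here.
CREATE STREAM avroTest5 WITH (
  KAFKA_TOPIC = 'avroTest5',
  VALUE_FORMAT = 'AVRO',
  VALUE_SCHEMA_ID = 1  -- assumed ID of the registered enum schema
) AS SELECT * FROM avroTest4_stream;
```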
```sql
SELECT * FROM avroTest5 EMIT CHANGES;
```
Expected behavior
The produced record should be emitted.
Actual behaviour
Throws an error during serialisation.
Additional context
The issue occurs even if you replace the CSAS with separate CREATE and INSERT statements.