thiagoananias opened this issue 3 years ago
I solved my issue when creating the schema. For example:
```js
const schema = await readAVSCAsync('avro/car.avsc');
const { id } = await registry.register(schema, { subject: 'car-value' });
```
I used the `subject` value to override the default behavior, which is to concatenate the namespace and name of the Avro schema.
But I didn't get exactly why... is this a pattern in Confluent Kafka?
Since no one bothered to answer this: Confluent provides three ways to map from a topic to a Schema Registry subject. You MUST adhere to one of their three strategies, since Confluent tools such as ksqlDB and Kafka Connect must be able to resolve the schema for a given topic.

- `TopicNameStrategy` (the default) uses the topic name as the subject: `{topic}-value` for the value schema, `{topic}-key` for the key.
- `RecordNameStrategy` uses the name of the message type: `{namespace}.{name}` for Avro, the fully qualified message name for Protobuf.
- `TopicRecordNameStrategy` combines the topic name with the record name as `{topic}-{record name}`.

All Confluent tools allow you to specify which strategy is in use for a given topic. Details are here: https://docs.confluent.io/platform/current/schema-registry/serdes-develop/index.html#sr-schemas-subject-name-strategy
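To make the mapping concrete, here is a small sketch of the subject each strategy yields for an Avro value schema; the topic, namespace, and record names are illustrative, not from this thread:

```typescript
// Sketch of how each subject name strategy derives the subject for a value schema.
type Strategy = 'TopicNameStrategy' | 'RecordNameStrategy' | 'TopicRecordNameStrategy';

function subjectFor(strategy: Strategy, topic: string, namespace: string, name: string): string {
  const recordName = `${namespace}.${name}`; // fully qualified Avro record name
  switch (strategy) {
    case 'TopicNameStrategy':
      return `${topic}-value`;                 // default: {topic}-value
    case 'RecordNameStrategy':
      return recordName;                       // {namespace}.{name}
    case 'TopicRecordNameStrategy':
      return `${topic}-${recordName}`;         // {topic}-{record name}
  }
}

// subjectFor('TopicNameStrategy', 'customer', 'com.example', 'Customer')       === 'customer-value'
// subjectFor('RecordNameStrategy', 'customer', 'com.example', 'Customer')      === 'com.example.Customer'
// subjectFor('TopicRecordNameStrategy', 'customer', 'com.example', 'Customer') === 'customer-com.example.Customer'
```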
Hello!
I'm using Kafka on Kubernetes, deployed with the Helm chart.
I'm using this code to insert data into the cluster, very simple.
The producer:

The client:

The Avro schema:
The thing is... everything works fine until I have to use Kafka Connect and ksqlDB.
The Kafka Connect problem: I have managed to create a connector that takes the messages and sends them to MongoDB, but why do I need the `value.converter.value.subject.name.strategy` property for it to work?
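A sketch of the sink configuration this refers to, assuming the schema was registered under its record name rather than `customer-value`; the registry URL is a placeholder:

```properties
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=customer
value.converter=io.confluent.connect.avro.AvroConverter
value.converter.schema.registry.url=http://schema-registry:8081
# Required here because the schema was not registered under the default
# {topic}-value subject, so the converter must look it up by record name:
value.converter.value.subject.name.strategy=io.confluent.kafka.serializers.subject.RecordNameStrategy
```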
The ksqlDB problem is actually two problems. First, I can't create a simple stream; it fails with an error.
When I list the subjects from the Schema Registry, there is no `customer-value` registered! Is this a problem?
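For context, a minimal stream over this topic would look like the hypothetical statement below; with `VALUE_FORMAT='AVRO'` and default settings, ksqlDB infers the value columns from the schema registered under the `customer-value` subject (TopicNameStrategy), so its absence matters:

```sql
-- Stream name is illustrative; ksqlDB infers columns from the registered schema.
CREATE STREAM customer_stream WITH (KAFKA_TOPIC='customer', VALUE_FORMAT='AVRO');
```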
Even after I add it manually, it gives me an error on the ksqlDB server when I run a query:
```
WARN stream-thread [_confluent-ksql-kafka-confluenttransient_5604733824524432010_1610479904778-81cfb00c-5d81-41f4-93ff-59d801a57908-StreamThread-1] task [0_0]
  Skipping record due to deserialization error. topic=[customer] partition=[0] offset=[4] (org.apache.kafka.streams.processor.internals.RecordDeserializer)
org.apache.kafka.common.errors.SerializationException: Error deserializing message from topic: customer
Caused by: org.apache.kafka.connect.errors.DataException: Failed to deserialize data for topic customer to Avro:
```
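The fix described at the top of the thread applies here as well: register the schema under the subject that TopicNameStrategy consumers expect. A minimal sketch with `@kafkajs/confluent-schema-registry`, where the hosts, file path, and record fields are assumptions, not the original code:

```typescript
import { Kafka } from 'kafkajs';
import { SchemaRegistry, readAVSCAsync } from '@kafkajs/confluent-schema-registry';

// Hosts and paths below are placeholders for whatever the cluster exposes.
const registry = new SchemaRegistry({ host: 'http://schema-registry:8081' });
const kafka = new Kafka({ brokers: ['kafka:9092'] });

async function produceCustomer(): Promise<void> {
  const schema = await readAVSCAsync('avro/customer.avsc');
  // Register under 'customer-value' (TopicNameStrategy) instead of the
  // library's default {namespace}.{name} subject, so ksqlDB and Connect
  // can resolve the schema for the 'customer' topic.
  const { id } = await registry.register(schema, { subject: 'customer-value' });

  const producer = kafka.producer();
  await producer.connect();
  await producer.send({
    topic: 'customer',
    // encode() prepends the Confluent wire-format header carrying the schema id.
    messages: [{ value: await registry.encode(id, { name: 'Jane Doe', age: 42 }) }],
  });
  await producer.disconnect();
}
```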
Thanks for the help