We have a Spark Streaming Kafka consumer job that reads data with the help of Schema Registry, and the schema-registry is running inside a Docker container. In some occasional scenarios the consumer fails with "SchemaId not found for the schema specified":
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema not found;
Caused by: org.apache.kafka.common.errors.SerializationException: Error retrieving Avro schema for id 8
This happens right after a restart of the schema-registry Docker container, and the schema ID for the same schema was updated after the restart.
One more thing: we were not able to reproduce the issue after every restart. Has anyone faced this issue before? Is it possible to specify the schema ID, or are there any other possible options?
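For context, this is my understanding of where the failing ID comes from: every record written by KafkaAvroSerializer starts with a 5-byte Confluent wire-format header, and the deserializer reads the ID out of that header before asking the registry for the schema. A minimal sketch of that lookup step (the function name is just for illustration):

import java.nio.ByteBuffer

// Confluent wire format: 1 magic byte (0x0) followed by a
// 4-byte big-endian schema ID, then the Avro-encoded payload.
def extractSchemaId(payload: Array[Byte]): Int = {
  val buffer = ByteBuffer.wrap(payload)
  val magic = buffer.get()                  // byte 0: magic byte, always 0x0
  require(magic == 0, s"Unknown magic byte: $magic")
  buffer.getInt()                           // bytes 1-4: the ID the registry
}                                           // fails to resolve, e.g. 8

So the ID baked into already-produced messages must keep resolving in the registry; if the registry hands out a different ID for the same schema after a restart, old messages break exactly like we see.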
As I have seen here (https://github.com/confluentinc/schema-registry/issues/878), the schema ID is supposed to be deterministic. Any other options?
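One option we are considering, in case someone can confirm it is viable: skip the registry lookup at read time entirely and decode against a pinned schema. A rough sketch, assuming the topic only ever carries one known schema (the record schema below is a placeholder, not our real one):

import org.apache.avro.Schema
import org.apache.avro.generic.{GenericDatumReader, GenericRecord}
import org.apache.avro.io.DecoderFactory

// Placeholder schema: pin the reader schema in the job instead of
// resolving it by ID from the registry.
val fixedSchema: Schema = new Schema.Parser().parse(
  """{"type":"record","name":"Event","fields":[
    |  {"name":"id","type":"string"},
    |  {"name":"value","type":"long"}
    |]}""".stripMargin)

val reader = new GenericDatumReader[GenericRecord](fixedSchema)

def decode(payload: Array[Byte]): GenericRecord = {
  // Skip the 1-byte magic byte and 4-byte schema ID of the Confluent
  // wire format; the rest is plain binary-encoded Avro.
  val decoder = DecoderFactory.get()
    .binaryDecoder(payload, 5, payload.length - 5, null)
  reader.read(null, decoder)
}

The obvious downside is that this gives up schema evolution on the topic, so we would rather understand and fix the ID instability itself if possible.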
Thanks