Closed jbkc85 closed 7 years ago
If you're in multiple data centers, you can follow our docs on a multi-DC setup, which includes replicating the _schemas topic for DR scenarios.
If you're only in one DC, there's also a really simple hack: use the console consumer to back up the data and the console producer to restore it. You'll need to adjust the arguments for your setup, but a command like this will dump the schemas:
./bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic _schemas --from-beginning --property print.key=true --timeout-ms 1000 1> schemas.log
And this one to restore them:
./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic _schemas --property parse.key=true < schemas.log
Since schemas aren't very large and there usually aren't many of them, either command should only take a couple of seconds.
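Before piping a dump back through the console producer, it's worth sanity-checking that it looks the way parse.key=true expects: one record per line, key and value separated by a tab (which is what print.key=true emits by default). A minimal sketch, using placeholder records rather than real _schemas contents:

```shell
# Illustrative dump in the format the console consumer emits with
# print.key=true: one record per line, key and value separated by a tab.
# (Real _schemas keys/values are JSON structures; these are placeholders.)
printf 'schema-key-1\t{"id":1}\nschema-key-2\t{"id":2}\n' > schemas.log

# Sanity-check the dump before piping it into the console producer:
# every line must contain the tab separator that parse.key=true expects.
tab=$(printf '\t')
total=$(wc -l < schemas.log)
well_formed=$(grep -c "$tab" schemas.log)
echo "records: $total, well-formed: $well_formed"
[ "$total" -eq "$well_formed" ] && echo "dump looks OK"
```

If the two counts differ, some lines lost their key/tab (for example, if the dump was made without print.key=true), and restoring it would write records with null keys.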
thanks for the answer! seems to be easy enough =).
I made a gist that uses Docker, so you don't need to install Kafka-specific packages (or Java) on the machine:
https://gist.github.com/cricket007/a12b4d9b26f0f2df4bee10cdc9c16d5d
I am fairly new to the Schema Registry (and, frankly, to everything Kafka) and was wondering whether there is a specific method or documented best practice for backing up and potentially restoring schemas stored in the Schema Registry. Basically, my worry is that in a DR scenario the schema IDs are lost; how do we recover the data and/or the applications looking for those IDs?
Thanks ahead of time!