confluentinc / schema-registry

Confluent Schema Registry for Kafka
https://docs.confluent.io/current/schema-registry/docs/index.html

[doc] Create advanced schema #495

Open treziac opened 7 years ago

treziac commented 7 years ago

The doc is missing a section explaining how to create a more elaborate schema than a one-string-field schema. How should I create a schema with, say, a timestamp or any other logical type that I would like to use with Kafka Connect? Manually, using a SchemaBuilder, or with a strongly typed class and schema generation?

If using the SchemaBuilder, is there any spec about this (all the specific logical type/name/property entries to include in the schema, like connect.name, org.apache.kafka.connect.data.Decimal, org.apache.kafka.connect.data.Timestamp...)? I'm using C# and plan to contribute an Avro serializer to the new Confluent C# client (https://github.com/confluentinc/confluent-kafka-dotnet/issues/67).
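For reference, a sketch of the kind of Avro schema that Connect's AvroConverter produces for logical types, based on how the converter annotates fields with `connect.name` and related properties. The record name, namespace, and field names here are placeholders, and the exact annotations should be verified against what the Java converter actually emits for your version:

```json
{
  "type": "record",
  "name": "Measurement",
  "namespace": "io.example",
  "fields": [
    { "name": "id", "type": "string" },
    {
      "name": "created_at",
      "type": {
        "type": "long",
        "logicalType": "timestamp-millis",
        "connect.name": "org.apache.kafka.connect.data.Timestamp",
        "connect.version": 1
      }
    },
    {
      "name": "amount",
      "type": {
        "type": "bytes",
        "logicalType": "decimal",
        "scale": 2,
        "connect.name": "org.apache.kafka.connect.data.Decimal",
        "connect.version": 1,
        "connect.parameters": { "scale": "2" }
      }
    }
  ]
}
```

The underlying Avro types are the important part: Connect's Timestamp is carried as a `long` (epoch millis) and Decimal as `bytes` (two's-complement unscaled value), with the `connect.*` properties telling the Java converter which Connect logical type to reconstruct.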

lxghtless commented 7 years ago

Bump.

To add to this question, I would love examples of how to publish a complex type. Here's an Avro schema example that works perfectly within a Kafka Streams processor, but is rejected with a 500 error by the Schema Registry. It is possible to embed the Address and Item type schemas directly into the Order schema, resulting in a large, redundant schema - but that generates different types for each property on the Java end of things, which is not maintainable for more complicated types.

```json
[
  {
    "namespace": "io.wakurth.examples.streams.avro",
    "type": "record",
    "name": "Address",
    "fields": [
      { "name": "Address1", "type": "string" },
      { "name": "City", "type": "string" },
      { "name": "State", "type": "string" },
      { "name": "PostalCode", "type": "string" }
    ]
  },
  {
    "namespace": "io.wakurth.examples.streams.avro",
    "type": "record",
    "name": "Item",
    "fields": [
      { "name": "ProductCode", "type": "string" },
      { "name": "UPC", "type": "string" },
      { "name": "SKU", "type": "string" },
      { "name": "ProductName", "type": "string" }
    ]
  },
  {
    "namespace": "io.wakurth.examples.streams.avro",
    "type": "record",
    "name": "Order",
    "fields": [
      { "name": "CustomerCode", "type": "string" },
      { "name": "ExternalReferenceNumber", "type": "string" },
      { "name": "Comment", "type": "string" },
      { "name": "Pick", "type": "Address" },
      { "name": "Drop", "type": "Address" },
      { "name": "Items", "type": { "type": "array", "items": "Item" } }
    ]
  }
]
```
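Without the actual error message this is only a guess, but the top-level JSON array (an Avro union of three named records) is a plausible cause of the rejection, since the registry expects one schema per subject. Embedding the nested types does not have to mean duplicating them: Avro allows a named record to be defined once and then referenced by name elsewhere in the same schema, so the generated Java classes stay unique. A single-schema version of the Order record above could look like this:

```json
{
  "namespace": "io.wakurth.examples.streams.avro",
  "type": "record",
  "name": "Order",
  "fields": [
    { "name": "CustomerCode", "type": "string" },
    { "name": "ExternalReferenceNumber", "type": "string" },
    { "name": "Comment", "type": "string" },
    {
      "name": "Pick",
      "type": {
        "type": "record",
        "name": "Address",
        "fields": [
          { "name": "Address1", "type": "string" },
          { "name": "City", "type": "string" },
          { "name": "State", "type": "string" },
          { "name": "PostalCode", "type": "string" }
        ]
      }
    },
    { "name": "Drop", "type": "Address" },
    {
      "name": "Items",
      "type": {
        "type": "array",
        "items": {
          "type": "record",
          "name": "Item",
          "fields": [
            { "name": "ProductCode", "type": "string" },
            { "name": "UPC", "type": "string" },
            { "name": "SKU", "type": "string" },
            { "name": "ProductName", "type": "string" }
          ]
        }
      }
    }
  ]
}
```

Here `Address` is defined inline at its first use (the `Pick` field) and referenced by name in `Drop`, so both fields map to the same generated Address class rather than two redundant types.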

OneCricketeer commented 2 years ago

I feel like this issue is better suited either to the confluent-kafka-dotnet project for the C# question (previous issue comments there say Avro with C# isn't recommended anymore, in favor of Protobuf), or to the Avro community for actually generating classes from schemas.

For the 500 error, the actual error messages will need to be provided.