Closed adaminsta closed 8 months ago
@adaminsta thanks for the suggestion. I think this makes sense, equivalent to the Kafka server type.
@adaminsta have a look here: https://github.com/datacontract/datacontract-specification/pull/33/files
Looks great!
Do you need to test the messages in Pub/Sub? Do you have any suggestions for which engines could help here?
We are planning to consume messages from Pub/Sub with Dataflow and then merge the data into Snowflake tables. My initial idea was to validate the format of the input in the Dataflow job by comparing it with the contract YAML.
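A minimal sketch of that validation step. The contract layout and field names below are simplified assumptions for illustration, not the actual datacontract-specification structure:

```python
# Sketch: validate an incoming Pub/Sub message against fields declared
# in a data contract. The contract shape here is an assumption, inlined
# as a dict to stand in for the parsed contract YAML.
import json

contract = {
    "models": {
        "orders": {
            "fields": {
                "order_id": {"type": "string", "required": True},
                "amount": {"type": "number", "required": True},
                "note": {"type": "string", "required": False},
            }
        }
    }
}

# Map contract types to Python types for a basic check
TYPE_MAP = {"string": str, "number": (int, float), "boolean": bool}

def validate(message: dict, model: dict) -> list:
    """Return a list of violations; empty means the message conforms."""
    errors = []
    for name, spec in model["fields"].items():
        if name not in message:
            if spec.get("required"):
                errors.append(f"missing required field: {name}")
            continue
        expected = TYPE_MAP.get(spec["type"])
        if expected and not isinstance(message[name], expected):
            errors.append(f"wrong type for {name}: expected {spec['type']}")
    return errors

msg = json.loads('{"order_id": "o-1", "amount": 9.5}')
print(validate(msg, contract["models"]["orders"]))          # []
print(validate({"amount": "x"}, contract["models"]["orders"]))
```

In a Dataflow job the same check could run per element, routing violations to a dead-letter output instead of failing the pipeline.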
An even more powerful solution would be if we could generate a Pub/Sub schema (Avro or Protocol Buffers only) from the contract YAML, which I could then attach to the Pub/Sub topic (https://cloud.google.com/pubsub/docs/schemas).
We're planning such export functionality. See #56 and #57. Feel free to provide some examples there (data contract -> Avro schema, data contract -> Protobuf) to help us drive the implementation.
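A rough illustration of the data contract -> Avro direction. The contract field layout is a simplified assumption, and the type mapping is one possible choice, not the library's actual export logic:

```python
# Sketch: derive an Avro record schema (JSON) from data-contract-style
# field definitions. Field layout and type mapping are assumptions.
import json

# One possible contract-type -> Avro-type mapping
AVRO_TYPES = {"string": "string", "number": "double",
              "integer": "long", "boolean": "boolean"}

def to_avro_schema(model_name: str, fields: dict) -> dict:
    avro_fields = []
    for name, spec in fields.items():
        avro_type = AVRO_TYPES.get(spec["type"], "string")
        if not spec.get("required", False):
            # Optional fields become a union with null in Avro
            avro_type = ["null", avro_type]
        avro_fields.append({"name": name, "type": avro_type})
    return {"type": "record", "name": model_name, "fields": avro_fields}

fields = {
    "order_id": {"type": "string", "required": True},
    "amount": {"type": "number", "required": True},
    "note": {"type": "string"},
}
print(json.dumps(to_avro_schema("orders", fields), indent=2))
```

The resulting JSON could then be registered as a Pub/Sub schema of type AVRO and attached to the topic.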
We are consuming data from a Pub/Sub topic and putting it into Snowflake, but there is no pubsub option for `servers`. Need something like this:
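For example, a `servers` entry along these lines (the field names here are a suggestion mirroring the other server types, not confirmed spec fields):

```yaml
servers:
  production:
    type: pubsub
    project: my-gcp-project   # GCP project hosting the topic (suggested field)
    topic: orders-events      # Pub/Sub topic name (suggested field)
```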