A common use case for Kafka is to consume one topic and output another, either filtering (different topic, same schema), mapping (different schema), or both.
We should create a class that acts as both a producer and a consumer, and that can provide not only its own schema but also the "target" topic and schema. The API should include an `include_payload?` method as well as a `generate_payload` method.
For the mapping case, the class should be smart enough to map the source payload to the target payload based on the two schemas, meaning we could theoretically write a mapper that needs zero code.
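A minimal sketch of what such a class could look like. All names here (`TopicTransformer`, `process`, the keyword arguments) are assumptions for illustration, not an existing API; schemas are modeled as simple field-name lists so the zero-code mapping case can fall out of the schema intersection:

```ruby
# Hypothetical sketch of a combined consumer/producer that filters and maps
# one topic onto another. Class and method names are assumptions.
class TopicTransformer
  # Source and target topics plus their schemas, given as field-name lists.
  def initialize(source_topic:, target_topic:, source_schema:, target_schema:)
    @source_topic  = source_topic
    @target_topic  = target_topic
    @source_schema = source_schema
    @target_schema = target_schema
  end

  # Filtering hook: subclasses override this to drop messages.
  # The default passes everything through.
  def include_payload?(_payload)
    true
  end

  # Mapping hook: the default copies every field the two schemas share,
  # so a pure schema-to-schema mapper needs zero custom code.
  def generate_payload(payload)
    shared = @source_schema & @target_schema
    payload.slice(*shared.map(&:to_sym))
  end

  # Consume one message and return the record to produce,
  # or nil if the message was filtered out.
  def process(payload)
    return nil unless include_payload?(payload)
    { topic: @target_topic, payload: generate_payload(payload) }
  end
end

# Usage: map a `users` topic onto a narrower `user_emails` topic.
mapper = TopicTransformer.new(
  source_topic:  'users',
  target_topic:  'user_emails',
  source_schema: %w(id name email),
  target_schema: %w(id email)
)
mapper.process(id: 1, name: 'Ada', email: 'ada@example.com')
# => { topic: 'user_emails', payload: { id: 1, email: 'ada@example.com' } }
```

A real implementation would wire `process` into the consumer loop and hand its result to the producer; the point of the sketch is that filtering and mapping each reduce to overriding a single hook.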