Open edmondop opened 6 years ago
An interesting article describing why one would want to implement this pattern can be found on the Confluent blog here: https://www.confluent.io/blog/leveraging-power-database-unbundled/ , in particular the section "Example: Building an Embedded, Materialised View using the Kafka Streams API". I also understand this might be more of a question for an Akka Streams - Kafka repository.
I agree with @edmondo1984 - this is a very important need.
@edmondo1984 can you please elaborate on what exactly you meant by "write back to Kafka stream the join result" and why it is not ideal? In Java, can't that be done in a reactive manner?
Consuming Kafka-Streams join in reactive fashion
A typical use case when dealing with microservices is to build persistent views, where you locally reconstruct a cache of other microservices' data, typically by joining Kafka streams. Then, when a record in this joined stream gets updated, one can reactively invoke a service to perform some task.
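As a concrete illustration, a minimal sketch of such a joined, locally materialised view using the Kafka Streams DSL might look like the following (topic names are illustrative, and values are kept as plain Strings only so the sketch stays self-contained; real code would use typed serdes):

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.Topology;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.KTable;

public class UserViewTopology {

  public static Topology build() {
    StreamsBuilder builder = new StreamsBuilder();

    // Local, continuously updated copies of two other services' data,
    // both keyed by user id.
    KTable<String, String> profiles =
        builder.table("user-profiles", Consumed.with(Serdes.String(), Serdes.String()));
    KTable<String, String> settings =
        builder.table("user-settings", Consumed.with(Serdes.String(), Serdes.String()));

    // The join result is recomputed whenever either side changes; this is
    // the point where we would like to react to each update without blocking.
    KTable<String, String> userView =
        profiles.join(settings, (profile, setting) -> profile + "|" + setting);

    return builder.build();
  }
}
```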
Today, however, Lagom only allows consuming a single topic in a reactive fashion, using Akka Streams underneath and building on top of the existing API, whereas Kafka Streams only offers synchronous consumption operations (peek/foreach).
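For reference, the reactive single-topic consumption that Lagom offers today looks roughly like this sketch against the Lagom Java broker API (`otherService`, its `eventsTopic()`, `OtherEvent` and `handleEvent` are placeholders):

```java
import akka.Done;
import akka.stream.javadsl.Flow;

// Reactive, back-pressured consumption of a single topic via Lagom.
otherService.eventsTopic()
    .subscribe()
    .atLeastOnce(
        Flow.<OtherEvent>create().mapAsync(1, event ->
            // Asynchronous handling of each message; no blocking needed.
            handleEvent(event).thenApply(result -> Done.getInstance())
        )
    );
```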
At the moment, there appear to be two possible approaches: either block (argh!) inside the Kafka Streams callback, as in the sketch below, or write the join result back to a Kafka topic and consume that topic with Lagom.
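Since the original snippet is not reproduced here, the blocking variant might look roughly as follows, reusing the `userView` table from the sketch above (`someService.userViewChanged` is a hypothetical Lagom service call):

```java
// Kafka Streams hands us records synchronously in foreach, so the
// asynchronous Lagom service call has to be forced to complete on the
// stream thread before the next record is processed.
userView.toStream().foreach((userId, view) ->
    someService.userViewChanged(userId)
        .invoke(view)
        .toCompletableFuture()
        .join()   // blocks the stream thread -- the "argh!" part
);
```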
Neither solution is fully satisfactory.
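For completeness, the write-back variant is a one-liner on the Kafka Streams side, but it introduces an extra topic whose only purpose is to get the data back into the reactive Lagom consumption shown earlier (the topic name is illustrative):

```java
// Write the join result back to a dedicated topic ...
userView.toStream().to("user-views");

// ... and let a Lagom service declare that topic in its descriptor and
// consume it with subscribe().atLeastOnce(...), as in the single-topic
// example above. The cost is an extra round trip through Kafka and one
// more topic to manage.
```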
Lagom Version (1.2.x / 1.3.x / etc)
All
API (Scala / Java / Neither / Both)
Both