lagom / lagom

Reactive Microservices for the JVM
https://www.lagomframework.com
Apache License 2.0

Support for publishing a WebSocket stream directly to Kafka #659

Open jroper opened 7 years ago

jroper commented 7 years ago

If no side effects are performed while processing a stream that will be published to Kafka, then it should be safe for Lagom to publish that stream directly to Kafka without any tracking of unpublished messages. An example use case was posted on Stack Overflow.
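Such an API does not exist in Lagom yet, but as a rough illustration of the idea, here is a minimal sketch (not Lagom's API) of publishing a stream straight to Kafka with plain Akka Streams and Alpakka Kafka. The topic name, bootstrap server, and in-memory message source are placeholders, and Akka 2.6+ is assumed so the ActorSystem supplies the materializer.

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.scaladsl.Source
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

object DirectToKafka extends App {
  // Assumes Akka 2.6+, where the ActorSystem provides the stream materializer.
  implicit val system: ActorSystem = ActorSystem("direct-to-kafka")

  val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  // Stand-in for the incoming WebSocket stream of messages.
  val incoming: Source[String, _] = Source(List("loc-1", "loc-2", "loc-3"))

  // No tracking of unpublished messages: each element is mapped to a record
  // and handed to the Kafka producer as it arrives (at-most-once).
  incoming
    .map(msg => new ProducerRecord[String, String]("device-locations", msg))
    .runWith(Producer.plainSink(producerSettings))
}
```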

aklikic commented 7 years ago

+1

jroper commented 7 years ago

@aklikic do you have any thoughts on what you'd imagine such an API to look like?

aklikic commented 7 years ago

@jroper I have a use case where a device's location is shared and has to be published to Kafka with at-most-once semantics. In that case a persistent entity and its event streams are overhead, so I would like an API like the one in PubSub: `TopicProducer.publish(topic, message)`
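For reference, Lagom's existing distributed PubSub API already has roughly this shape; the request is for an equivalent call that targets a broker-backed topic instead of in-cluster subscribers. A rough sketch assuming the Scala API, where `LocationServiceImpl` and `DeviceLocation` are illustrative stand-ins and the service trait is omitted:

```scala
import akka.NotUsed
import com.lightbend.lagom.scaladsl.api.ServiceCall
import com.lightbend.lagom.scaladsl.pubsub.{ PubSubRegistry, TopicId }
import scala.concurrent.Future

// Illustrative message type for the device-location use case above.
final case class DeviceLocation(deviceId: String, lat: Double, lon: Double)

class LocationServiceImpl(pubSub: PubSubRegistry) /* extends LocationService */ {

  // Today: fire-and-forget publish to in-cluster subscribers via PubSub.
  def shareLocation(deviceId: String): ServiceCall[DeviceLocation, NotUsed] =
    ServiceCall { location =>
      pubSub.refFor(TopicId[DeviceLocation](deviceId)).publish(location)
      Future.successful(NotUsed)
    }

  // Requested: the same shape, but publishing to a Kafka-backed topic, e.g.
  // TopicProducer.publish(locationTopic, location)  // hypothetical, not in Lagom
}
```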

ignasi35 commented 7 years ago

The Topic where data is published should be part of the Service Descriptor since other services should be able to subscribe to it transparently.

Then, a ServiceImpl could implement the topic method with something like:

```scala
TopicProducer.connect(queue)((inputMessage: InputMessage) => Source[OutputMessage, Any])
```

This way, a singleton queue (an instance variable in the service implementation) would be available to all API methods for pushing `InputMessage` instances. The code in the topic construction (see the snippet above) would consume that in-memory queue and adapt incoming messages to the format expected on the Kafka topic.

If the service crashed, all data in the queue would be lost (future implementations could use persistent queues and/or rely on PubSub for load balancing, etc.).
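A minimal sketch of this queue-backed pattern using plain Akka Streams and Alpakka Kafka, rather than the proposed `TopicProducer.connect` (which does not exist). `QueueBackedTopic`, `InputMessage`, the topic name, and the buffer settings are illustrative assumptions, and Akka 2.6+ is assumed for the implicit materializer; as described above, anything still buffered in memory is lost if the service crashes.

```scala
import akka.actor.ActorSystem
import akka.kafka.ProducerSettings
import akka.kafka.scaladsl.Producer
import akka.stream.{ OverflowStrategy, QueueOfferResult }
import akka.stream.scaladsl.{ Keep, Source }
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer
import scala.concurrent.Future

// Illustrative stand-in for the InputMessage type in the snippet above.
final case class InputMessage(deviceId: String, payload: String)

class QueueBackedTopic(implicit system: ActorSystem) {

  private val producerSettings =
    ProducerSettings(system, new StringSerializer, new StringSerializer)
      .withBootstrapServers("localhost:9092")

  // Singleton in-memory queue: API methods push InputMessage here; the
  // materialized stream adapts each message and writes it to the Kafka topic.
  private val queue =
    Source
      .queue[InputMessage](bufferSize = 1024, OverflowStrategy.dropHead)
      .map(in => new ProducerRecord[String, String]("device-locations", in.deviceId, in.payload))
      .toMat(Producer.plainSink(producerSettings))(Keep.left)
      .run()

  // Called from service API methods to push a message onto the queue.
  def offer(message: InputMessage): Future[QueueOfferResult] = queue.offer(message)
}
```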