Closed patpatpat123 closed 3 years ago
With riff 0.0.x,
we used Kafka to pull messages to the function for invocation. With riff 0.1,
we use HTTP to push messages to the function for invocation. In this new model, it's still architecturally possible to use Kafka as the driver for function invocations. What you will need (and what doesn't exist yet) is an event Source that reads from Kafka. In this model, your function shouldn't need to know anything about Kafka (or HTTP, for that matter) in order to receive messages from a Kafka topic.
The particulars around Sources are very much in flight and not quite ready to be implemented yet, but I do expect to see a Kafka-based Source implemented, similar to how we already have a Kafka-based Bus implementation for Channels.
I hope that helps.
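In this push model the function itself can stay transport-agnostic: a plain function of its input type, with the platform (riff, via a Source) handling delivery. A minimal sketch in plain Java; the class and method names are hypothetical, not part of riff's API:

```java
import java.util.function.Function;

// A transport-agnostic handler: the function knows nothing about Kafka or HTTP.
// A platform-provided Source/invoker would call apply() with each message payload.
// Class and method names here are hypothetical, not part of riff's API.
public class UppercaseFunction implements Function<String, String> {

    @Override
    public String apply(String payload) {
        return payload.toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(new UppercaseFunction().apply("hello")); // prints HELLO
    }
}
```

Because the function is just `Function<String, String>`, the same code could be fed by an HTTP invoker today or a Kafka Source later, without changes.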
Hello Scott,
Thank you very much for your clear answer. Would it be possible to share a roadmap indicating which version this will be possible in? In the meantime, would a workaround based on Spring Cloud Stream be possible?
```java
@Bean
@Input(NUMBERONE)
SubscribableChannel binding1();

@Bean
@Input(NUMBERTWO)
SubscribableChannel binding2();

@Bean
@Output(NUMBERTHREE)
MessageChannel singleOutput();

@Bean
@Output(NUMBERFOUR)
MessageChannel anotherOutput();
```

```java
@StreamListener(MultipleProcessor.NUMBERONE)
@SendTo(MultipleProcessor.NUMBERTHREE)
// The first method's signature was truncated in the original post; the name and
// body are reconstructed to match the [{"someBar":"HELLO"}] response shown below.
public Flux<Bar> uppercase(Flux<Foo> fooFlux) {
    return fooFlux.map(oneFoo -> new Bar(oneFoo.getSomeFoo().toUpperCase()));
}

@StreamListener(MultipleProcessor.NUMBERTWO)
@SendTo(MultipleProcessor.NUMBERFOUR)
public Flux<Bar> apply(Flux<Foo> fooFlux) {
    return fooFlux.map(oneFoo -> new Bar(oneFoo.getSomeFoo().replace("hello", "world")));
}
```
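The listener code references constants such as `MultipleProcessor.NUMBERONE` without showing their values. A minimal sketch of those constants; in the full binding interface they would sit alongside the `@Input`/`@Output` methods, and the values are an assumption, chosen to match the Kafka topic names (`numberone`, `numberthree`) used further down in the thread:

```java
// Hypothetical channel-name constants referenced by the @StreamListener /
// @SendTo annotations; the values are assumed to match the Kafka topic
// names used in the producer/consumer snippets later in this thread.
public interface MultipleProcessor {
    String NUMBERONE = "numberone";
    String NUMBERTWO = "numbertwo";
    String NUMBERTHREE = "numberthree";
    String NUMBERFOUR = "numberfour";
}
```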
```properties
spring.cloud.stream.kafka.binder.brokers=knative-eventing:9092
spring.cloud.stream.kafka.binder.zkNodes=knative-eventing:2181
```
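For the channels to actually map onto Kafka topics, binding destinations would also be needed. A sketch using Spring Cloud Stream's standard `spring.cloud.stream.bindings.<channel>.destination` property; the channel and topic names here are assumptions:

```properties
spring.cloud.stream.bindings.numberone.destination=numberone
spring.cloud.stream.bindings.numberthree.destination=numberthree
```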
```java
String topicName = "numberone";
kafkaParams.put("bootstrap.servers", "knative-eventing:9092");
kafkaParams.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
kafkaParams.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
KafkaProducer<String, String> producer = new KafkaProducer<>(kafkaParams);
producer.send(new ProducerRecord<>(topicName, "key", "{\"someFoo\":\"hello\"}"));
```
```java
String topic = "numberthree";
props.put("bootstrap.servers", "knative-eventing:9092");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
consumer.subscribe(Arrays.asList(topic));
// then call consumer.poll(...) in a loop to receive records
```
I believe a Kafka- or RabbitMQ-based trigger that works with riff would be a good idea. Thank you again for your presentation with Eric.
I am also investigating the same thing. My startup has some upstream systems that just put messages into a common Kafka cluster on specific topics. They do not wait for any response and do not care about who consumes the messages.
We currently have some ad hoc consumers that consume the messages and send the transformed payload to some other downstream systems.
Right now, those transformers are deployed and wait for messages. They stay idle most of the time, and when it gets busy, they cannot handle the load and we have to deploy more transformers manually. Very painful.
We are looking forward to migrating our transformers to functions and, most of all, to using riff. Maybe something like
Is it possible?
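For what it's worth, the transformer described above is essentially a stateless function from one payload to another, which is exactly the shape a function platform wants. A minimal sketch in plain Java; the types and the replace rule are hypothetical, loosely mirroring the Foo/Bar example elsewhere in this thread:

```java
import java.util.function.Function;

// Hypothetical payload types standing in for the upstream/downstream messages.
class Foo { final String someFoo; Foo(String s) { this.someFoo = s; } }
class Bar { final String someBar; Bar(String s) { this.someBar = s; } }

// A stateless transformer: given a Foo from the upstream topic, produce the Bar
// to send downstream. No Kafka code anywhere; a Kafka Source would feed messages
// in, and the platform could scale instances up and down with the load.
public class Transformer implements Function<Foo, Bar> {
    @Override
    public Bar apply(Foo foo) {
        return new Bar(foo.someFoo.replace("hello", "world"));
    }

    public static void main(String[] args) {
        Bar bar = new Transformer().apply(new Foo("hello there"));
        System.out.println(bar.someBar); // prints world there
    }
}
```

Because the transformer holds no state, scaling it is just running more copies, which is what would address the manual-scaling pain described above.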
May I kindly ask whether it is possible to scope and milestone this feature, please?
I am also interested in this.
All I can see from your demos, getting-started website, and quickstarts are functions that respond to a riff invoke or a curl command. This is unlikely to be representative of serverless architectures: it is not feasible to have all other services send riff commands or curl requests. Other products have triggers based on database changes, GitHub changes, and much more.
A Kafka-based (or any other message-bus) trigger for functions created by riff would be a very nice addition to this project. I find this serverless solution based on Knative to be quite nice; having Kafka triggers would be a huge plus.
Hello Riff Team,
First of all, thank you very much for this very interesting project. I have been following Mark Fisher's and Dave Syer's talks and have learned a lot about the project. Currently, I am working on a use case that uses Kafka messages as the trigger for functions.
Josh, Mark, Dave, and many more did some demos, but with the old riff version. Furthermore, I have a specific case with which I would like your help.
I have a very basic Spring Cloud Function project: https://github.com/patpatpat123/riffquestion/tree/master/src/main/java/some/github/question
The functions are dockerized and deployed in Kubernetes.
```shell
riff service create riffquestion --image xxx/docker-riffquestion
```
And I have an external service that is supposed to trigger the functions like this:
I am very happy about this project; I can see the beauty of riff and serverless, even with just the two functions! The first response: `[{"someBar":"HELLO"}]`; the second response: `[{"someBar":"world"}]`.
However, in my real use case, the external services do not send HTTP POSTs but Kafka messages, like this. (There are other consumer services as well, but let's keep it simple for now.)
This is my real use case. I believe it is a fairly interesting, or at least a reasonable, one. Is it possible to do this with riff? If yes, could you please help me achieve it?
I believe the idea of external microservices sending HTTP POSTs and/or Kafka messages to trigger functions corresponds to the core concepts of event-driven and serverless architecture. I will be glad if I can achieve this.
Thank you very much for this project, and for your kind help.