cforce closed this issue 5 years ago.
There is some chance that dataflow apps will be functions in some future version. The convergence is explicit and obvious (Spring Cloud Stream is the glue).
I wouldn't look at any of your other parallels and draw too many conclusions (some of them have one end that doesn't exist anyway, like spring-cloud-web and spring-cloud-function-registry-service). The deployer in particular is highly experimental, but it could be a taste of what a dataflow server would look like if all apps were running locally (i.e. no intention to manage anything at high scale or as XaaS).
The Spring Cloud Function adapter, as I understand it, is a wrapper around Spring Cloud Function implementations that lets you deploy them on supported FaaS providers. How do I run an SCF app in Data Flow? I found a starter app: is that the "wrapper" for calling SCF from the SCDF server? https://github.com/spring-cloud-stream-app-starters/function/blob/master/spring-cloud-starter-stream-app-function/README.adoc
I guess that is a wrapper. It would be just as easy for some (most?) apps to simply include spring-cloud-function-stream as a dependency, and then they are automatically SCDF apps. SCF only requires the binding, AFAIK.
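To make that concrete, a sketch of what the dependency might look like in a Maven build. This assumes the artifact keeps the org.springframework.cloud group id used by the other Spring Cloud Function modules; check the Spring Cloud Function BOM for the coordinates that match your release train.

```xml
<!-- Hypothetical coordinates for illustration; the version is normally
     managed by the Spring Cloud Function / Spring Cloud release BOM. -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-function-stream</artifactId>
</dependency>
```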
Spring Cloud Function itself does not offer features to compose dataflow streams, but Spring Cloud Data Flow does. I am wondering whether the Data Flow server will let me define streams using tasks that I have deployed on a public FaaS.
Currently the Data Flow server allows me "only" to deploy to on-prem CaaS (Kubernetes) or PaaS (Cloud Foundry / OpenShift). But will the Data Flow server also deploy Spring Cloud Tasks to a FaaS (e.g. Riff) and bind them to Kafka?
Just an idea: if functions on a public cloud FaaS (or Riff) were published via an Open Service Broker as a "FaaS" task catalogue, we could register them automatically, cloud-natively, in the Spring Cloud Data Flow server and then compose streams with them.
Additionally, I would like to know what Riff integration is planned, and how Spring Cloud Data Flow relates to Pivotal's FaaS platform, Riff.
In short, Data Flow orchestrates Spring Cloud Stream applications over the messaging middleware of your choice (Rabbit, Kafka, etc.). As of now, Spring Cloud Stream is fully integrated with Spring Cloud Function, which provides an alternative programming model. So composing a data flow from individual functions is no different than composing one from any other Spring Cloud Stream app.
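That "alternative programming model" is just plain `java.util.function` types: the business logic is a `Function`, `Supplier`, or `Consumer`, and Spring Cloud Stream binds it to input/output destinations on the middleware. A minimal sketch, shown here as plain Java so it runs standalone (in a real Spring Boot app the function would be exposed as a `@Bean` and the class/method names below are made up for illustration):

```java
import java.util.function.Function;

public class UppercaseFunction {

    // The entire "app" is a java.util.function.Function. In a Spring Cloud
    // Stream application, a bean of this type is automatically bound to an
    // input and an output destination on Rabbit/Kafka; the same function
    // could instead be wrapped by a FaaS adapter.
    public static Function<String, String> uppercase() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        // Locally, the function is just invoked; the transport is Spring's
        // concern, not the business logic's.
        System.out.println(uppercase().apply("hello"));
    }
}
```

This is the point of the convergence the thread is discussing: the unit of logic stays a plain function, and where it runs (stream app, task, FaaS) is decided by the wrapper around it.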
Registration of producers/consumers in the Data Flow server, as stream apps (Spring Cloud Stream) and functions (Spring Cloud Function), is needed so that these apps can be downloaded, used in stream/pipe/dataflow layouts, and afterwards shipped/deployed to Kubernetes or PCF (or locally, for development). What the Data Flow server does not support is registering Azure/AWS/Google functions that are already deployed on their cloud platforms and using them as producers/consumers in your dataflow stream. A FaaS platform like project Eirini (which proceeded from Riff), alongside native Kubernetes as a deployment target, would be a cloud-native option.
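For readers unfamiliar with the registration step being described, a sketch of the Data Flow shell flow. The app name, Maven coordinates, and stream definition below are made up for illustration; only the command shapes (`app register`, `stream create`, `stream deploy`) reflect the SCDF shell:

```
dataflow:> app register --name upper --type processor --uri maven://com.example:upper-processor:1.0.0
dataflow:> stream create --name demo --definition "http | upper | log"
dataflow:> stream deploy demo
```

Everything registered this way must resolve to a deployable artifact the server can ship to Kubernetes/PCF, which is exactly why an already-deployed Azure/AWS/Google function cannot be plugged in as a stream app today.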
I am going to close this as stale; in any case the discussion is mostly about Data Flow, hence out of scope for SCF. If you have a more specific suggestion/request, please open a new issue.
Is the only difference the granularity (microservice > data flow > task > function) of the deployment unit in which the business logic is boxed? Please help me figure out what will stick together, what may be dropped in favour of the other, and what will simply melt together across these two projects, where I see so much overlap in patterns and functionality.
While Data Flow already has Kubernetes support to scale out running processes, I don't understand how spring-cloud-function-deployer will solve the stateful process management of function instances at high scale, or on which provided XaaS.
Thanks for clarifying the differences, commonalities, and common roadmap of these two projects.