Explore how an app can use `kubectl apply -f` to create arbitrary Spark Streaming jobs.
Acceptance criteria:
Dockerfile packaging the app
App must expose a REST API
App must call kubectl
App must accept SQL through the REST API and create a Spark Streaming job from it
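A minimal sketch of the acceptance criteria above: a function that renders a job manifest from submitted SQL and pipes it to `kubectl apply -f -`. This assumes the Kubernetes spark-operator's `SparkApplication` CRD is installed on the target cluster; the manifest fields, the `SqlStreamRunner` entry point, and the function names are illustrative assumptions, and a real implementation should build the YAML with a proper serializer rather than string templating (SQL containing quotes or colons would break this template).

```python
import subprocess
from string import Template

# Hypothetical SparkApplication manifest (assumes the spark-operator CRD).
# String templating is for illustration only; use a YAML library in practice.
MANIFEST = Template("""\
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: $name
spec:
  type: Scala
  mode: cluster
  mainClass: example.SqlStreamRunner   # hypothetical streaming-SQL runner
  arguments:
    - $sql
""")

def render_manifest(name: str, sql: str) -> str:
    """Render a SparkApplication manifest embedding the submitted SQL."""
    return MANIFEST.substitute(name=name, sql=sql)

def submit(name: str, sql: str) -> None:
    """Pipe the rendered manifest to `kubectl apply -f -` (kubectl must be
    on PATH and configured against the target cluster)."""
    subprocess.run(
        ["kubectl", "apply", "-f", "-"],
        input=render_manifest(name, sql).encode(),
        check=True,
    )
```

The REST layer would simply call `submit()` from its POST handler; piping the manifest via stdin avoids writing temp files inside the container.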
I had originally thought this would run on the same k8s cluster, but now I think it could be a separate autoscaling instance; there is no need to tie it directly to each client's k8s instance.
Thinking out loud: this may be better as a GitHub workflow that provisions Spark Streaming jobs and runs kubectl against the cluster. That would give us out-of-the-box monitoring/alerting (via GitHub Actions) and better security.
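The workflow idea could look roughly like the sketch below: a manually dispatched workflow that takes the SQL as an input and applies the rendered manifest with kubectl. The workflow name, the `render-manifest.sh` script, and the `KUBECONFIG` secret are hypothetical; only `workflow_dispatch`, `actions/checkout`, and `azure/setup-kubectl` are real GitHub Actions features.

```yaml
# Hypothetical workflow: provision a Spark Streaming job on demand.
name: provision-spark-job
on:
  workflow_dispatch:
    inputs:
      sql:
        description: Streaming SQL to deploy
        required: true
jobs:
  apply:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: azure/setup-kubectl@v4
      - name: Render manifest from SQL input
        # render-manifest.sh is a hypothetical script that emits the
        # SparkApplication YAML for the submitted SQL
        run: ./scripts/render-manifest.sh "${{ inputs.sql }}" > job.yaml
      - name: Apply to the cluster
        env:
          KUBECONFIG_DATA: ${{ secrets.KUBECONFIG }}  # assumed repo secret
        run: |
          echo "$KUBECONFIG_DATA" | base64 -d > kubeconfig
          KUBECONFIG=kubeconfig kubectl apply -f job.yaml
```

This keeps cluster credentials in repo secrets and gives each provisioning run an auditable log, which is where the monitoring/alerting and security benefits would come from.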