Spark generates a UI for monitoring a single job. This is currently unavailable to the user because of ingress rules (the URL is generated dynamically). It may be feasible to expose the UI to the user via port-forwarding or by updating ingress rules dynamically. If not, update the decision record to explain why.
Update
Investigation showed that it is possible to expose a Spark UI for each job. This will be implemented via a proxy pod with dynamically updated ingress rules.
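As an illustration of that approach, the sketch below creates a per-job ingress rule with the official Kubernetes Python client. It assumes a proxy Service named spark-ui-proxy-<job-id> fronting the driver UI on port 4040, an NGINX ingress controller, and a /spark-ui/<job-id>/ path scheme; these names and conventions are placeholders, not the actual implementation.

```python
# Minimal sketch: register a dynamic ingress rule for one job's Spark UI.
# Assumes a Service named f"spark-ui-proxy-{job_id}" already exists and an
# NGINX ingress controller handles the rewrite annotation.
from kubernetes import client, config


def create_spark_ui_ingress(job_id: str, namespace: str = "default") -> str:
    config.load_incluster_config()  # or config.load_kube_config() outside the cluster
    networking = client.NetworkingV1Api()

    ingress = client.V1Ingress(
        metadata=client.V1ObjectMeta(
            name=f"spark-ui-{job_id}",
            annotations={
                # Strip the /spark-ui/<job_id> prefix before proxying to the UI.
                "nginx.ingress.kubernetes.io/rewrite-target": "/$2",
            },
        ),
        spec=client.V1IngressSpec(
            rules=[
                client.V1IngressRule(
                    http=client.V1HTTPIngressRuleValue(
                        paths=[
                            client.V1HTTPIngressPath(
                                path=f"/spark-ui/{job_id}(/|$)(.*)",
                                path_type="ImplementationSpecific",
                                backend=client.V1IngressBackend(
                                    service=client.V1IngressServiceBackend(
                                        name=f"spark-ui-proxy-{job_id}",
                                        port=client.V1ServiceBackendPort(number=4040),
                                    )
                                ),
                            )
                        ]
                    )
                )
            ]
        ),
    )
    networking.create_namespaced_ingress(namespace=namespace, body=ingress)
    # This relative path is what would be returned to the user on job submission.
    return f"/spark-ui/{job_id}/"
```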
Acceptance criteria
[x] When submitting a job, the user is returned a URL for the Spark UI for their job
[x] If the Spark UI is not set up correctly, the URL returned to the user is "Unavailable"
[x] The Spark UI proxy, service, and ingress rule are deleted when a job is deleted (see the cleanup sketch after this list)
[x] It is possible to access the Spark UIs for multiple jobs simultaneously, assuming the jobs are running concurrently
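The sketch below, under the same assumed naming scheme as above, illustrates the "Unavailable" fallback and the per-job cleanup described in the last two criteria; the resource names and lookup strategy are hypothetical.

```python
# Minimal sketch of the "Unavailable" fallback and per-job cleanup.
from kubernetes import client
from kubernetes.client.rest import ApiException


def spark_ui_url(job_id: str, namespace: str = "default") -> str:
    """Return the Spark UI URL for a job, or "Unavailable" if the UI was not set up."""
    try:
        client.NetworkingV1Api().read_namespaced_ingress(
            name=f"spark-ui-{job_id}", namespace=namespace
        )
        return f"/spark-ui/{job_id}/"
    except ApiException:
        return "Unavailable"


def delete_spark_ui(job_id: str, namespace: str = "default") -> None:
    """Delete the proxy pod, service, and ingress rule created for a job."""
    core = client.CoreV1Api()
    networking = client.NetworkingV1Api()
    deletions = (
        lambda: networking.delete_namespaced_ingress(f"spark-ui-{job_id}", namespace),
        lambda: core.delete_namespaced_service(f"spark-ui-proxy-{job_id}", namespace),
        lambda: core.delete_namespaced_pod(f"spark-ui-proxy-{job_id}", namespace),
    )
    for delete in deletions:
        try:
            delete()
        except ApiException:
            pass  # resource already gone; cleanup stays idempotent
```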