Sometimes we need to suspend a running Spark application for a short time: maybe to manipulate a Kafka consumer group, to prepare some Elasticsearch/Cassandra/MongoDB/... entities, or for any other operation that needs the application and its data to be in a stable state. Currently, the only way to do this is to delete the SparkApplication object, losing the object and its history.
Like the k8s CronJobs API, a SparkApplication should be suspendable, stopping the scheduling of streaming batches until we finish our operations.
I suggest a new SparkApplication state: Suspended.
The new state can be reached from any state except the final states: Completed and Failed.
To switch to this new state, we would only need to annotate the SparkApplication object. To resume a suspended SparkApplication, we would only need to remove the annotation.
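A minimal sketch of what this could look like. The annotation key `sparkoperator.k8s.io/suspend` is a hypothetical name chosen for illustration, not part of the operator's current API:

```yaml
# Hypothetical suspend annotation on a SparkApplication.
# The key "sparkoperator.k8s.io/suspend" is an assumed name, not an existing API.
apiVersion: sparkoperator.k8s.io/v1beta2
kind: SparkApplication
metadata:
  name: my-streaming-app
  annotations:
    sparkoperator.k8s.io/suspend: "true"  # remove this annotation to resume
```

In practice this could be set with `kubectl annotate sparkapplication my-streaming-app sparkoperator.k8s.io/suspend=true` and removed with `kubectl annotate sparkapplication my-streaming-app sparkoperator.k8s.io/suspend-` (the trailing `-` removes an annotation).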