spring-cloud / spring-cloud-dataflow

Microservices-based streaming and batch data processing in Cloud Foundry and Kubernetes
https://dataflow.spring.io
Apache License 2.0

After a Boot 3 schedule fails, the deployer information is missing upon restart. #5777

Open alsdud154 opened 7 months ago

alsdud154 commented 7 months ago

I am running Data Flow deployed with Helm chart version 26.8.1 [app version: 2.11.2]. I registered a Spring Batch application as a task in Data Flow and launch it with a schedule. When creating the schedule, I set the deployer volume properties in scheduler.main.properties. The task was executed by the schedule but failed, so I re-ran the failed execution by pressing the Restart button. At that point, the newly launched task runs without the deployer volume values that were set on the schedule.
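For reference, the deployer volume properties set on the schedule were of roughly this form (the app name batch-task and the volume details are illustrative placeholders, not my actual values):

```properties
# Hypothetical deployer properties supplied when creating the schedule
deployer.batch-task.kubernetes.volumes=[{name: 'batch-data', persistentVolumeClaim: {claimName: 'batch-data-pvc'}}]
deployer.batch-task.kubernetes.volumeMounts=[{name: 'batch-data', mountPath: '/data'}]
```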

I think this is a bug. The problem does not occur if the job is launched with "LAUNCH TASK" instead of a schedule and then restarted after it fails.

Please let me know how to solve it.

corneil commented 7 months ago

This does seem to be a bug.

corneil commented 7 months ago

@alsdud154 Was the deployer property in question visible in the task execution view? Does your application include the Spring Cloud Task dependency?

alsdud154 commented 7 months ago

@corneil Thanks for your reply

Task information for the execution launched by the schedule [Job Execution Id 151]:

[screenshot]

Task information for the failed job re-run with the Restart button [Job Execution Id 152]:

[screenshot]

The Spring Batch applications running on Kubernetes include org.springframework.cloud:spring-cloud-starter-task.

[screenshot]

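For completeness, a typical Maven declaration for that starter looks like this (version managed by the Spring Cloud BOM; shown as a generic example, not the project's actual build file):

```xml
<!-- Adds Spring Cloud Task so executions are recorded in the task repository -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-task</artifactId>
</dependency>
```
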
cppwfs commented 7 months ago

SCDF creates a task manifest when launching a task, and that manifest stores the deployment properties. However, when SCDF schedules a task, the deployment information is passed on to the CronJob, but Data Flow does not store it. Thus, when the scheduled task fails and needs to be restarted, SCDF does not have the deployment information, because it has no manifest. SCDF needs to be updated to store this manifest (deployment information) for the schedule.

A possible workaround is to use global properties (if your tasks use the same deployment properties): https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#configuration-kubernetes-app-props
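As a rough sketch, global deployer defaults can be set in the Data Flow server configuration under the Kubernetes task platform account. The property names and volume layout below are illustrative; verify the exact keys against the linked documentation and your chart version:

```yaml
spring:
  cloud:
    dataflow:
      task:
        platform:
          kubernetes:
            accounts:
              default:
                # Illustrative defaults applied to every task launched on this platform account.
                # The volume/volumeMount structure is an assumption; check the linked docs.
                volumes:
                  - name: batch-data
                    persistentVolumeClaim:
                      claimName: batch-data-pvc
                volumeMounts:
                  - name: batch-data
                    mountPath: /data
```

With defaults like these in place, a restarted execution would pick up the volume settings from the server configuration even though the schedule's deployment properties are not stored in a manifest.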

alsdud154 commented 7 months ago

@cppwfs Is it correct that an SCDF version upgrade will be needed once the feature to store the manifest (deployment info) is developed? I'll use the global properties for now, but I hope a feature that saves the manifest for schedules is added in the future.

Thank you.