chaps-io / gush

Fast and distributed workflow runner using ActiveJob and Redis
MIT License

Processing job sequentially #24

Closed akshayrawat closed 8 years ago

akshayrawat commented 8 years ago

Is it possible to process jobs sequentially, based on arguments?

Example jobs:

Jobs: [User A, Operation A], [User B, Operation B], [User A, Operation C], [User B, Operation D], ... (a stream of jobs)

Process all jobs of each user sequentially, but concurrently across users.

Example:

  1. Sequentially execute all jobs for User A: Operation A, Operation C
  2. Sequentially execute all jobs for User B: Operation B, Operation D

However, (1) and (2) get executed concurrently.

I couldn't tell from the documentation whether it's possible. Any pointers would be much appreciated.

pokonski commented 8 years ago

Hi @akshayrawat, one way would be to dynamically generate subgraphs inside the Workflow configure method, like this:

def configure(users)
  users.each do |user|
    # Chain each user's operations: every job waits for the previous one,
    # while different users' chains are independent of each other.
    operation_a = run OperationA, params: { user_id: user.id }
    operation_b = run OperationB, params: { user_id: user.id }, after: operation_a
    run OperationC, params: { user_id: user.id }, after: operation_b
  end
end

run returns the id of the registered job, so you can use it later on. I haven't done that myself yet, but it should be doable :)

I'll try that myself and make a spec for this case to see if it makes sense.
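To make the dependency shape concrete, here is a minimal, self-contained sketch of that pattern in plain Ruby (no Gush required; the class and job names are hypothetical, and FakeWorkflow only mimics the relevant behavior: run registers a job and returns its id, and after: records dependencies). Each user's operations form a chain, while chains for different users share no dependencies and can execute concurrently:

```ruby
# Plain-Ruby model of the dynamic-subgraph idea (hypothetical names).
class FakeWorkflow
  attr_reader :jobs

  def initialize
    @jobs = []
  end

  # Mimics Gush's `run`: registers a job and returns its id.
  def run(klass, params: {}, after: nil)
    id = "#{klass}-#{@jobs.size}"
    @jobs << { id: id, klass: klass, params: params, after: Array(after).compact }
    id
  end

  def configure(user_ids)
    user_ids.each do |user_id|
      # Chain this user's operations; other users' chains stay independent.
      a = run "OperationA", params: { user_id: user_id }
      b = run "OperationB", params: { user_id: user_id }, after: a
      run "OperationC", params: { user_id: user_id }, after: b
    end
  end
end

wf = FakeWorkflow.new
wf.configure([1, 2])
# Each job's `after` list only ever names jobs of the same user,
# so the two per-user chains can run concurrently.
p wf.jobs.map { |j| [j[:id], j[:after]] }
```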

ferusinfo commented 8 years ago

@akshayrawat, as @pokonski suggested, the solution is quite simple, and I can confirm it works in my production workflows: I process the jobs sequentially, and after all of them finish I generate a status report that triggers a Slack webhook.
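The fan-in shape described here (a single report job that runs only after every chain has finished) can be sketched by collecting each chain's last job id and passing the whole list to after:. This is a hedged illustration with hypothetical job names; register is a plain-Ruby stand-in for Gush's run, which returns the new job's id:

```ruby
# Minimal fan-in sketch: a single report job depends on the tail of
# every per-user chain. `register` stands in for Gush's `run`.
jobs = []
register = lambda do |name, after: []|
  jobs << { id: name, after: after }
  name
end

tails = [1, 2, 3].map do |user_id|
  a = register.call("process_#{user_id}")
  register.call("finalize_#{user_id}", after: [a]) # last job of this chain
end

# The report job waits for every chain to finish before firing,
# e.g. to post a Slack webhook summarizing all users.
register.call("status_report", after: tails)
p jobs.last
```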