I am exploring options for breaking up a large job into a batch of smaller jobs. I know that Sidekiq Pro has this ability, but I am wondering if this gem has the capacity to do this as well.
What I would like is for an array of values to be passed to the same worker and run in parallel, with the number of workers determined dynamically by the number of values in the array. So
```ruby
Superworker.create(:MyBatchSuperworker, :user_ids) do
  batch user_ids: :user_id do
    BatchWorker :user_id
  end
end
```
when calling:
```ruby
MyBatchSuperworker.perform_async([10, 11, 12])
```
would create 3 BatchWorker jobs that run at the same time (not serially). Without having tried out the DSL, I get the sense that the current expected behavior is for the jobs to run serially. Is that correct?
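To be concrete, here is a plain-Ruby sketch of the fan-out behavior I have in mind. Threads stand in for independent Sidekiq worker processes, and `process_user` / `batch_perform` are illustrative names of my own, not part of any gem's API:

```ruby
# Illustrative stand-in for whatever BatchWorker#perform would do with one id.
def process_user(user_id)
  "processed #{user_id}"
end

# Fan out one concurrent unit of work per array element, then collect results.
# In real Sidekiq this would be N independent jobs, not threads in one process.
def batch_perform(user_ids)
  user_ids
    .map { |id| Thread.new { process_user(id) } }
    .map(&:value) # Thread#value joins the thread and returns its block's result
end

batch_perform([10, 11, 12])
```

The key point is that the unit count is derived from the array at call time, rather than being fixed in the DSL definition.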
If parallel execution does not exist in the current DSL, how hard would it be to add a feature for running the batches in parallel instead of serially?