halaei opened 7 years ago
There is nothing stopping you from doing this with the existing jobs system. You just create a new BatchSMSJob
and pass it the list of contacts to send to...
@tomschlick Only if you can create a BatchSmsJob beforehand. When jobs are queued one-by-one, you cannot do this. For example, if an SMS is sent to a user whenever they log in, then in a high-traffic application it becomes necessary to batch-process multiple SMS jobs in the queue, even though they were pushed one-by-one.
Ah so you're looking to batch the same type of jobs together that are not submitted to the queue together...
That's a much more complicated operation, as the queue would have to read all of the pending items to know which ones to batch together.
To keep it simple, my suggestion is to have specific types of queues with predefined batch handlers. So everything pushed to the 'webhook' queue will be processed by `BatchWebhookCaller::handle(Collection $events)`. The code that pushes to the queue must be responsible for not pushing jobs that `BatchWebhookCaller` can't handle.
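A minimal sketch of what such a predefined batch handler could look like. The class name `BatchWebhookCaller` comes from the comment above, but the method signature and the use of a plain array instead of `Illuminate\Support\Collection` are assumptions made to keep the sketch self-contained:

```php
<?php
// Hypothetical batch handler: everything pushed to the 'webhook' queue
// is handed to this class as ONE list of payloads, instead of one
// worker invocation per job. An array stands in for Laravel's
// Collection so the sketch runs without the framework.
class BatchWebhookCaller
{
    /** @var string[] URLs that were "called", recorded for demonstration */
    public array $called = [];

    /** @param array<int, array{url: string, body: string}> $events */
    public function handle(array $events): void
    {
        // One pass over the whole batch. A real implementation would
        // issue the HTTP requests concurrently (e.g. via curl_multi).
        foreach ($events as $event) {
            $this->called[] = $event['url'];
        }
    }
}

$caller = new BatchWebhookCaller();
$caller->handle([
    ['url' => 'https://example.com/a', 'body' => '{}'],
    ['url' => 'https://example.com/b', 'body' => '{}'],
]);
```

The point of the shape is that slow per-item I/O is amortized over the batch; the caller's contract is simply "everything on this queue is a webhook event".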
But can this be built generically enough to be provided as framework functionality, instead of, let's say, a simple sms_to_send table that you fetch SMS from yourself to send them in batches?
I agree with what @sisve said.
If you need to do this, it's better done with some intermediary step, like storing them in the database and having a cron run every minute to throw them into the queue in batches.
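The intermediary-step approach described here can be sketched without any framework code. The `$smsToSend` array below is a stand-in for a hypothetical sms_to_send table, and the `while` loop plays the role of the per-minute cron callback:

```php
<?php
// Sketch of the intermediary-step approach: pending jobs are first
// written to a table (an in-memory array here), and a scheduled task
// drains it in fixed-size batches.
$smsToSend = range(1, 7);  // pretend these are 7 pending SMS rows
$batchSize = 3;
$batches   = [];

// What the cron/scheduler callback would do on each run:
while (count($smsToSend) > 0) {
    // Take up to $batchSize rows and dispatch them as ONE queued job.
    $batches[] = array_splice($smsToSend, 0, $batchSize);
}

// 7 rows are drained as batches of 3, 3 and 1.
```

The trade-off versus worker-side batching is latency: a job can wait up to one cron interval before being picked up.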
@sisve SMS and webhooks were just examples. I think we can come up with a general solution that can be added to Laravel 5.5 (illuminate/queue). It should be driver based, since the Laravel queue currently supports multiple drivers: maybe I prefer Redis, you like the database driver, and someone else goes for SQS or a custom driver.
I have needed this feature, and I think many others would also like to have it in the framework.
It does present some new problems though, like what if one of the items in the batch causes an exception?
Does the whole thing fail? On retry do you exclude the one that previously failed or do you try it again?
These are just a lot easier to handle on a case-by-case basis (single jobs) than as a batch operation.
@tomschlick If an item causes an exception but the rest of the items are successfully handled, it will be the responsibility of the batch handler to catch that exception and do something about it (marking it as failed would be an option). If an exception is raised by the batch handler itself, it means that all the jobs in the batch failed. Whether, how, and when to retry the failed jobs is the responsibility of the Laravel queue system, just as it already is in Laravel <= 5.4.
I agree that the code of a batch handler is more complicated; for one thing, it must iterate over the items. That is the cost I am willing to pay to resolve performance issues, instead of waiting on a single slow network/IO operation when I can do 100 of them at once.
This suggested feature might also be useful when updating an Elasticsearch index; with Elasticsearch it is better to use the bulk API. https://www.elastic.co/blog/found-keeping-elasticsearch-in-sync#the-bulk-api-a-must-for-most-applications
I've been working on implementing this in my project. I ended up creating a `BatchWorker` that extends `Worker`. In the `while (true)` loop I store my jobs in a protected `$batchJobs = []` property. When conditions are met (that is, the batch size is reached, or there have been enough loops without new jobs), I wrap all these jobs into a `SyncJob` and execute `runJob()`. It's a super nasty implementation, but enough to start tinkering.
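The accumulate-then-flush loop described above can be sketched in isolation. This is not the actual `BatchWorker` from that project; it is a framework-free sketch where `drainQueue`, the in-memory `$queue` array, and the flush-to-array step (in place of wrapping jobs in a `SyncJob`) are all made up for illustration:

```php
<?php
// Stand-alone sketch of the BatchWorker idea: pop jobs in a loop,
// buffer them, and flush the buffer once the batch size is reached
// or the queue has gone quiet for enough iterations.
function drainQueue(array $queue, int $batchSize, int $maxIdleTicks): array
{
    $batchJobs = [];
    $flushed   = [];
    $idleTicks = 0;

    while (true) {
        $job = array_shift($queue); // stand-in for $this->getNextJob()

        if ($job === null) {
            $idleTicks++;
            // For the sketch, stop once the queue stays empty too long.
            if ($idleTicks > $maxIdleTicks) {
                break;
            }
        } else {
            $idleTicks   = 0;
            $batchJobs[] = $job;
        }

        // Flush when full, or when idle long enough with a partial batch.
        if (count($batchJobs) >= $batchSize
            || ($idleTicks >= $maxIdleTicks && $batchJobs !== [])) {
            $flushed[] = $batchJobs; // stand-in for SyncJob + runJob()
            $batchJobs = [];
        }
    }

    return $flushed;
}

$batches = drainQueue(['a', 'b', 'c', 'd', 'e'], 2, 3);
// Batches: ['a','b'], ['c','d'], then ['e'] flushed on idle.
```

The idle counter is what prevents a partial batch (here `['e']`) from waiting forever when no new jobs arrive.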
I think it would be useful to have a function or a closure, called before handle, with a condition like `Sms::all()->count() > 100`, to execute the job only when needed. A delay would be used to define the checking frequency of the condition, like the scheduler. It would be more generic, and the dynamic payload would be generated in handle, from an sms table for example.
Another solution could be to update the payload of the job, adding new SMS until the limit is reached.
An important decision: are we batching anything on a given queue, causing a batch to contain jobs of different types, or are we limiting a batch to a single job type?
A simple scenario: the default queue with the following content, in this order:
Should this result in ...
@sisve I think it will be simpler and cleaner to limit each queue to jobs of the same type. `SendSmsJob` should be pushed to a queue that is processed by a `SendSmsHandler` worker; such a worker can handle `SendSmsJob` but not `SendEmailJob`.
Yes, I agree that you have to push only one type of job to a special queue that is processed in batch.
Also, in my handler, I can check if the payload has a `batch` key, so I can process either batch records or single ones. My use case is Elasticsearch `_bulk` indexing.
I added two options to my command: `max-ticks-idle` and `batch-size`. That way I can trigger processing whenever the batch size is reached, or when a number of ticks have passed without new messages being produced (i.e. my batch size is 50, but I have 49 records waiting for indexing).
@halaei this is now available in Laravel 8
@cyrrill no. This is actually a different concept with a similar name.
@halaei Ok, thanks for clarifying, I landed here from a Google search. Curious, 3 years after this ticket was opened, is there some code regarding this feature to see?
I'm looking into sending batches of message jobs, and would be interested in seeing various existing implementations.
See the bulk queue wiki. Sometimes it would be nice to handle queued jobs, possibly pushed one-by-one, in batches. Examples in my mind are:
Sample code in mind:
`queue.php` file:
Batch worker classes:
Console commands:
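The original issue's code samples did not survive extraction, so purely as a hedged reconstruction of what a per-queue batch setting in `config/queue.php` might have looked like: none of the `batch`, `handler`, or `batch_size` keys below exist in Laravel, and `App\Queue\BatchSmsHandler` is hypothetical.

```php
<?php
// Hypothetical additions to config/queue.php illustrating the idea of
// per-queue batch handlers. These keys are NOT real Laravel options.
return [
    'connections' => [
        'redis' => [
            'driver' => 'redis',
            'queue'  => 'default',
            'batch'  => [
                'sms' => [
                    // Everything on the 'sms' queue goes to this
                    // handler in groups of up to batch_size jobs.
                    'handler'    => 'App\\Queue\\BatchSmsHandler',
                    'batch_size' => 100,
                ],
            ],
        ],
    ],
];
```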