Closed: tetherit closed this issue 8 years ago.
Unfortunately there is not.
On Friday, May 6, 2016, Roman Gaufman notifications@github.com wrote:
Is there any way to make sure that, before I add a new job, there are no jobs like it already in the queue or being processed?
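For anyone looking for a workaround: Sucker Punch has no built-in uniqueness check, but a process-local guard is straightforward to roll yourself. Below is a minimal sketch, assuming the sucker_punch 2.x `perform_async` API; `UniqueCallbackJob`, `perform_unique`, and the `key` argument are hypothetical names used only for illustration.

```ruby
require 'sucker_punch'
require 'set'

# Sucker Punch has no job-uniqueness feature, so this tracks in-flight
# job keys in a process-local Set guarded by a Mutex.
class UniqueCallbackJob
  include SuckerPunch::Job

  IN_FLIGHT = Set.new
  LOCK      = Mutex.new

  # Enqueue only if no job with the same key is already queued or running.
  # Returns true if the job was enqueued, false if it was skipped.
  def self.perform_unique(key, *args)
    LOCK.synchronize do
      return false if IN_FLIGHT.include?(key)
      IN_FLIGHT.add(key)
    end
    perform_async(key, *args)
    true
  end

  def perform(key, *args)
    # ...actual work for this job goes here...
  ensure
    LOCK.synchronize { IN_FLIGHT.delete(key) }
  end
end

UniqueCallbackJob.perform_unique("order-42") # => true
UniqueCallbackJob.perform_unique("order-42") # => false while the first is still pending
```

Note that this only deduplicates within a single process, which is relevant to the Unicorn discussion below.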
Even when I set workers to 1, it seems that sometimes more than one worker is running and more than one job is executing at a time.
@rmoriz The minimum number of threads is 2.
@brandonhilkert With Celluloid we could monkeypatch the minimum number of threads to 1 to get serial execution :/
@rmoriz Actually, now that I look at it, it appears it should go down to 1. What makes you believe it's not?
Can you open a separate issue related to that alone?
(Updated) Looks like it was a user error on my side…
Okay, my issue was: Unicorn + Sinatra + Sucker Punch. 3 Unicorn workers × 1 Sucker Punch worker = 3 parallel jobs.
I've reduced the number of Unicorn workers in our setup (an internal, very low-traffic API/callback service) to fix the problem, so it is now 1 × 1.
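For reference, a minimal sketch of why the numbers multiply: `workers` is Sucker Punch's class-level thread-pool option, but it only caps threads inside one process, so each Unicorn worker process boots its own pool. `CallbackJob` and the file paths here are illustrative, not from this thread.

```ruby
# app/jobs/callback_job.rb (illustrative)
class CallbackJob
  include SuckerPunch::Job
  workers 1 # caps this class at one job thread, but only within this process

  def perform(payload)
    # ...handle the callback...
  end
end
```

```ruby
# config/unicorn.rb (illustrative)
# Each Unicorn worker is a separate process with its own Sucker Punch pool,
# so total concurrency = worker_processes * Sucker Punch workers.
worker_processes 1
```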