brandonhilkert / sucker_punch

Sucker Punch is a Ruby asynchronous processing library using concurrent-ruby, heavily influenced by Sidekiq and girl_friday.
MIT License

Unique jobs? #177

Closed tetherit closed 8 years ago

tetherit commented 8 years ago

Is there any way to make sure, before I add a new job, that there are no jobs like it already in the queue or being processed?

brandonhilkert commented 8 years ago

Unfortunately there is not.
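Since the library itself doesn't track uniqueness, one workaround is to guard enqueues at the application level. A minimal sketch, assuming a process-local registry (the `JobRegistry` class and its method names are hypothetical, not part of Sucker Punch's API):

```ruby
require "set"

# Hypothetical guard: records keys for jobs that are queued or running
# in THIS process, so callers can skip duplicate enqueues.
class JobRegistry
  def initialize
    @mutex = Mutex.new
    @keys  = Set.new
  end

  # Returns true and records the key if no job with that key is pending;
  # returns false if a duplicate is already queued or running.
  def try_acquire(key)
    @mutex.synchronize { @keys.add?(key) ? true : false }
  end

  # Call when the job finishes (e.g. from an ensure block in #perform).
  def release(key)
    @mutex.synchronize { @keys.delete(key) }
  end
end
```

The caller would enqueue only when `try_acquire` returns true, and `release` the key in an `ensure` inside the job's `perform`. Note this only covers a single process; with multiple app server workers, each process has its own registry, so cross-process uniqueness would need an external store (e.g. Redis).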

On Friday, May 6, 2016, Roman Gaufman notifications@github.com wrote:

Is there anyway to make sure that before I added a new job, there are no jobs like it already in the queue or are being processed?

— You are receiving this because you are subscribed to this thread. Reply to this email directly or view it on GitHub https://github.com/brandonhilkert/sucker_punch/issues/177


http://brandonhilkert.com

rmoriz commented 8 years ago

Even when I set workers to 1, it seems that sometimes more than one job is executing at a time.

brandonhilkert commented 8 years ago

@rmoriz min. # of threads is 2.

rmoriz commented 8 years ago

@brandonhilkert with Celluloid we could monkeypatch the min # of threads to 1 to get serial execution :/

brandonhilkert commented 8 years ago

@rmoriz Actually, now that I look at it, it appears it should go down to 1. What makes you believe it's not?

Can you open a separate issue related to that alone?

rmoriz commented 8 years ago

(updated) looks like a user error on my side…

rmoriz commented 8 years ago

Okay, my issue was: Unicorn + Sinatra + Sucker Punch. 3 Unicorn workers × 1 Sucker Punch worker = 3 parallel jobs.

I've reduced the number of Unicorn workers in our setup (an internal, very low-traffic API/callback service) to fix the problem, so it is now 1 × 1.
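Since each Unicorn worker process runs its own Sucker Punch pool, total concurrency is worker processes × pool threads. A sketch of the relevant Unicorn config fragment for a 1 × 1 setup (path and other settings assumed):

```ruby
# config/unicorn.rb -- a single worker process, so only one
# in-process Sucker Punch pool exists.
worker_processes 1
```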