JustinLove / autoscaler

Start/stop Sidekiq workers on Heroku

Handling multiple queues with a single worker dyno #52

Closed brandoncc closed 9 years ago

brandoncc commented 9 years ago

I know this topic has come up in two other issues, so I apologize for opening a third. I have been unable to find an answer on whether I can monitor three queues with a single worker, or whether I am required to have a worker entry in my Procfile for each queue. I worry about database and Redis connection limits, so I would really like to do this with a single dyno. Here is my Sidekiq initializer:

unless Rails.env.test? || Rails.env.development?
  require 'autoscaler/sidekiq'
  require 'autoscaler/heroku_scaler'

  Sidekiq.configure_client do |config|
    config.client_middleware do |chain|
      heroku = Autoscaler::HerokuScaler.new
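      # one scaler instance shared by all three queues, so they all scale the same worker dyno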
      chain.add Autoscaler::Sidekiq::Client, 'default' => heroku, 'mailers' => heroku, 'high_priority' => heroku
    end
    config.redis = { :size => 10 }
  end

  Sidekiq.configure_server do |config|
    config.server_middleware do |chain|
      chain.add(Autoscaler::Sidekiq::Server, Autoscaler::HerokuScaler.new, 60) # 60 second timeout
    end
    config.redis = { :size => 12 }
  end
end

I currently have three jobs in the mailers queue, but AutoScaler is not scaling up my worker to handle them.

Thanks for your time!

thatandyrose commented 9 years ago

hey @brandoncc, did you ever get this working?

brandoncc commented 9 years ago

No, I ended up switching to Cloud66 paired with EC2 for other reasons, so I am no longer using this gem. Sorry!

thatandyrose commented 9 years ago

no worries @brandoncc, thanks for the quick reply!

JustinLove commented 9 years ago

The configuration looks good on first pass. The client has all three queues scaling the same worker type, and with only one worker type the server config is simple. Having the worker actually process multiple queues is Sidekiq configuration:

https://github.com/mperham/sidekiq/wiki/Advanced-Options
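
Something like the following should cover it (an untested sketch; the concurrency value and file path are just examples, and the queue names are the ones from your client middleware):

# config/sidekiq.yml -- the queue list this single worker process will poll
:concurrency: 5
:queues:
  - default
  - mailers
  - high_priority

Then a single Procfile entry such as worker: bundle exec sidekiq -C config/sidekiq.yml lets one dyno drain all three queues, which is what the client middleware above already assumes.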