logstash-plugins / logstash-input-rabbitmq


Slow throughput with 3.1.4/3.1.5 #69

Closed thenom closed 8 years ago

thenom commented 8 years ago

Hi,

I recently upgraded this plugin because of https://github.com/logstash-plugins/logstash-input-rabbitmq/issues/67. Previously I was getting well over 10k messages/s, but now I get approximately 2k.

Nothing other than this plugin has changed, and I have had to go back to 3.1.3.

Please let me know if you need any information regarding the setup.

Thanks, Simon

suyograo commented 8 years ago

@thenom can you provide us with jstack output for LS? Can you get us about 5 snapshots, repeated at a 2-second interval?
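For example, a shell loop along these lines would capture them (a sketch; `LS_PID` is a placeholder for the Logstash JVM's process ID):

    # Grab 5 thread dumps from the Logstash JVM, 2 seconds apart.
    # LS_PID is a placeholder, e.g. obtained via `pgrep -f logstash`.
    for i in 1 2 3 4 5; do
        jstack "$LS_PID" > "jstack_$i.txt"
        sleep 2
    done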

andrewvc commented 8 years ago

@thenom out of curiosity, are you using ACKing?

andrewvc commented 8 years ago

@thenom can you share your config with us?

thenom commented 8 years ago

@andrewvc,

Sorry about the delay in replying. I don't use ACK for the consumers. Here is the input config:

    rabbitmq {
        queue => "prod"
        durable => true
        host => "sdc-vip.prod.local"
        exchange => "prod_incoming_logs"
        key => "prod.xml.*"
        user => "logstash"
        password => "logstash"
        threads => 1
        prefetch_count => 256
        codec => "json"
        ack => false
    }

This config didn't change between updates, and the throughput went back up when I reverted to 3.1.3.

@suyograo I will find running this test difficult because our dev setup does not have anywhere near the throughput of our prod setup, and as I cannot put this back into live, I will not be able to replicate the issue.

andrewvc commented 8 years ago

@thenom I have a patch set in #73 that makes this plugin fast again.

thenom commented 8 years ago

@andrewvc Great, cheers. I will try it out when I get back to work.

andrewvc commented 8 years ago

@thenom any updates here? It would be great to have some real-world feedback.

thenom commented 8 years ago

Sorry about the delay. I have set it up in our development environment but have not loaded it up for functional testing yet. Will let you know. Hopefully it will be tomorrow.


thenom commented 8 years ago

Hi,

OK, after a nightmare week at work I finally got time to test in dev.

I kept getting the following problem:

{:timestamp=>"2016-03-18T15:51:14.552000+0000", :message=>"No exchange type declared for exchange incoming_logs!", :class=>"LogStash::ConfigurationError", :location=>"/opt/logstash/vendor/local_gems/94c9285d/logstash-input-rabbitmq-3.3.1/lib/logstash/inputs/rabbitmq.rb:193:in `bind_exchange!'", :level=>:warn}

I got this initially on startup even though the queue, binding, and exchange existed, so I deleted the queue and just left the exchange. After restarting Logstash, the queue came back with no binding and still the same error.

[root@pup-rabbit01 ~]# rabbitmqctl list_exchanges
Listing exchanges ...
        direct
amq.direct      direct
amq.fanout      fanout
amq.headers     headers
amq.match       headers
amq.rabbitmq.log        topic
amq.rabbitmq.trace      topic
amq.topic       topic
incoming_logs   topic
mi_incoming     topic
...done.

Is this because my exchange type is topic?

andrewvc commented 8 years ago

@thenom the latest version of this plugin requires that you declare the exchange type explicitly with a new 'exchange_type' option.
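A minimal sketch of the change against the input block posted earlier in this thread (only the `exchange_type` line is new; `topic` matches the type shown in the `rabbitmqctl list_exchanges` output above):

    rabbitmq {
        queue => "prod"
        durable => true
        host => "sdc-vip.prod.local"
        exchange => "prod_incoming_logs"
        exchange_type => "topic"   # new requirement: declare the exchange's type explicitly
        key => "prod.xml.*"
        user => "logstash"
        password => "logstash"
        threads => 1
        prefetch_count => 256
        codec => "json"
        ack => false
    }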

andrewvc commented 8 years ago

This should be fixed in the recently released 4.0.0 version. Thanks for the report @thenom, please open a new issue if you still have problems here!

thenom commented 8 years ago

Hi @andrewvc,

Once again, sorry for the delays. 3.3.1 is functionally running fine on my dev stack with the additional exchange_type option. Ironically, the reason I have been so busy is that our production stack has throughput issues, so I can't fully load it. I am sure it will be fine, but I will let you know if I get any more issues.

Thanks.