logstash-plugins / logstash-output-sqs


AWS::SQS::Errors::BatchRequestTooLong #10

Closed NoumanSaleem closed 8 years ago

NoumanSaleem commented 8 years ago

Started seeing this in our development environment after it had been working smoothly for over a week.

{:timestamp=>"2016-04-06T13:56:28.015000+0000", :message=>"Failed to flush outgoing items", :outgoing_count=>10, :exception=>"AWS::SQS::Errors::BatchRequestTooLong", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:375:in `return_or_raise'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/core/client.rb:476:in `client_request'", "(eval):3:in `send_message_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-v1-1.66.0/lib/aws/sqs/queue.rb:551:in `batch_send'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-sqs-2.0.2/lib/logstash/outputs/sqs.rb:130:in `flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:216:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:193:in `buffer_flush'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.22/lib/stud/buffer.rb:159:in `buffer_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-sqs-2.0.2/lib/logstash/outputs/sqs.rb:122:in `receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/outputs/base.rb:83:in `multi_receive'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/outputs/base.rb:83:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/output_delegator.rb:119:in `worker_multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/output_delegator.rb:118:in `worker_multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/output_delegator.rb:65:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:290:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:290:in `output_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:221:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.2.2-java/lib/logstash/pipeline.rb:190:in `start_workers'"], :level=>:warn}

Versions: logstash 2.2, logstash-codec-plain 2.0.2, logstash-output-sqs 2.0.2

config:

output {
  sqs {
    queue => "xx"
    region => "xx"
  }
}
NoumanSaleem commented 8 years ago

http://docs.aws.amazon.com/sdkforruby/api/Aws/SQS/Client.html#send_message_batch-instance_method

The maximum allowed individual message size is 256 KB (262,144 bytes). The maximum total payload size (i.e., the sum of all a batch's individual message lengths) is also 256 KB (262,144 bytes).

Interesting that the total batch payload limit is the same as the individual message limit. With batches of 10 events (the :outgoing_count=>10 in the log above), anything averaging more than about 26 KB per event is enough to push the total past 262,144 bytes.

I will reduce the batch count. Thank you!
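For anyone else hitting this, a minimal sketch of that workaround, assuming this plugin version exposes a batch_events setting (check the options supported by your installed version):

output {
  sqs {
    queue => "xx"
    region => "xx"
    batch_events => 5  # hypothetical value; smaller batches only help if individual events stay well under the 256 KB cap
  }
}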

ph commented 8 years ago

Given the nature of the events, I think we should be able to adjust the batch size dynamically when the current batch is too big for the SQS limit. Would you mind opening an issue for that?
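A rough sketch of what that adjustment could look like, in plain Ruby outside the plugin. The 262,144-byte payload cap comes from the SQS docs quoted above, and SendMessageBatch accepts at most 10 entries per request; the helper name and the usage line are hypothetical:

MAX_BATCH_BYTES   = 262_144  # total payload limit from the SQS docs
MAX_BATCH_ENTRIES = 10       # SendMessageBatch entry limit

# Split encoded event strings into sub-batches that respect both limits.
def split_into_batches(messages)
  batches = [[]]
  bytes   = 0
  messages.each do |msg|
    size = msg.bytesize
    # A single message over the cap cannot be sent at all; a real
    # implementation would drop or truncate it and log a warning.
    next if size > MAX_BATCH_BYTES
    if batches.last.size >= MAX_BATCH_ENTRIES || bytes + size > MAX_BATCH_BYTES
      batches << []
      bytes = 0
    end
    batches.last << msg
    bytes += size
  end
  batches.reject(&:empty?)
end

# Hypothetical use with the v1 SDK queue object seen in the backtrace:
# split_into_batches(encoded_events).each { |batch| queue.batch_send(batch) }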

NoumanSaleem commented 8 years ago

@ph that would be a great enhancement! I will open an issue.