logstash-plugins / logstash-output-s3

Apache License 2.0

The XML you provided was not well-formed or did not validate against our published schema #131

Open runningman84 opened 7 years ago

runningman84 commented 7 years ago
Updating the logstash-output-s3 plugin from 3.x to 4.0.6 broke the output:

[2017-03-06T11:59:24,494][ERROR][logstash.outputs.s3 ] Uploading failed, retrying {:exception=>Aws::S3::Errors::MalformedXML, :message=>"The XML you provided was not well-formed or did not validate against our published schema", :path=>"/data/logstash/s3_out/ls.s3.ip-10-18-0-154.eu-central-1.compute.internal.2017-03-01T14.16.part487.txt", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_sse_cpk.rb:19:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/s3_accelerate.rb:33:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/plugins/response_target.rb:21:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/request.rb:70:in `send_request'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-core-2.3.22/lib/seahorse/client/base.rb:207:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:42:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:49:in `open_file'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:41:in `put_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/file_uploader.rb:34:in `upload'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/aws-sdk-resources-2.3.22/lib/aws-sdk-resources/services/s3/object.rb:251:in `upload_file'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3/uploader.rb:38:in `upload'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-s3-4.0.6/lib/logstash/outputs/s3/uploader.rb:29:in `upload_async'", "org/jruby/RubyProc.java:281:in `call'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/concurrent-ruby-1.0.0-java/lib/concurrent/executor/java_executor_service.rb:94:in `run'", "Concurrent$$JavaExecutorService$$Job_1733552426.gen:13:in `run'"]}

eastokes commented 7 years ago

I got this error also, on Logstash 5.2.0 with plugin 4.0.5. It was related to setting temporary_directory, though I'm still not sure why. I removed that setting and it worked.
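
For reference, temporary_directory is a setting on the s3 output block. A minimal sketch of where it goes, with placeholder region, bucket, and path (none of these values come from an actual setup in this thread):

output {
    s3 {
        region => "us-east-1"                         # placeholder
        bucket => "example-bucket"                    # placeholder
        # Part files are staged in this directory before upload; stale part
        # files left here appear to be what triggers the MalformedXML error.
        temporary_directory => "/opt/logstash/s3_tmp" # placeholder path
    }
}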

sashazykov commented 7 years ago

Having the same issue after upgrading to 5.2.0.

kmcgerald commented 7 years ago

I was having the same problem on my Logstash boxes and did not have a "temporary_directory" in the config, but I stopped Logstash, removed a part file in "/tmp/logstash/ls.s3.*", and started Logstash again. No errors since. The files were a few months old and probably dated from before the upgrade. I didn't have time to sort out why the new plugin didn't like the old files.

runningman84 commented 7 years ago

The temporary_directory option is quite important for us because we also use it for monitoring.

timg456 commented 7 years ago

I'm having this issue as well. It started on Logstash 5.4.2, and upgrading to Logstash 5.6.2 didn't fix the problem. I'm sending a single valid JSON object for my log data. I deleted the log files that had been stuck retrying for about two weeks, but new log files generated in the temp directory throw the same error.

Here's what my config looks like.

input {
    s3 {
        bucket => "BUCKET"
        prefix => "PATH"
        access_key_id => "KEY"
        secret_access_key => "SECRET_KEY"
        region => "us-east-1"
        backup_add_prefix => "sent-to-logstash-"
        backup_to_bucket => "BUCKET"
        interval => 120
        codec => "json"
        tags => "TAG"
        delete => true
    }
}
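
That's the s3 input side; the MalformedXML error in this thread is raised by the s3 output. For illustration only, a rough sketch of what a matching s3 output block could look like, reusing the same placeholders (the actual output config isn't shown above, so treat this as an assumption):

output {
    s3 {
        region => "us-east-1"
        bucket => "BUCKET"                  # placeholder
        prefix => "PATH"                    # placeholder
        access_key_id => "KEY"              # placeholder
        secret_access_key => "SECRET_KEY"   # placeholder
    }
}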
idsvandermolen commented 6 years ago

We ran into the same issue after upgrading Logstash from 2.4.1 to 5.6.2 with the S3 output plugin (plugin upgraded from 3.2.0 to 4.0.11). It seems to be related to the new filename format (which includes a UUID). The work-around seems to be to remove the old files from temporary_directory.

VictorCovalski commented 5 years ago

I'm getting the same error with Logstash 6.5.1 and plugin version v4.1.7.

Below is my configuration

output {
    s3 {
        region => "us-east-1"
        bucket => "logstash-poc-logs"
        canned_acl => "private"
        rotation_strategy => "size"
    }
}
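
With rotation_strategy => "size", rotation is driven by the size_file threshold rather than elapsed time. A sketch that makes the threshold explicit (the 5242880-byte figure is illustrative, not taken from the config above):

output {
    s3 {
        region => "us-east-1"
        bucket => "logstash-poc-logs"
        canned_acl => "private"
        rotation_strategy => "size"
        size_file => 5242880   # rotate part files at about 5 MB; illustrative value
    }
}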

bminahan-kc commented 4 years ago

+1

This also still happens in the more recent version 7.9.1.

Stopping Logstash, removing the temp files, and starting it again fixes the issue.