Azure / azure-diagnostics-tools

Plugins and tools for collecting, processing, managing, and visualizing diagnostics data and configuration

[logstash-input-azureblob] Response 403 Authentication Failed #208

Open briangardner opened 4 years ago

briangardner commented 4 years ago

I'm getting a 403 authentication error when trying to use this plugin. I've triple-checked my storage account, access key, and container settings. I think it may be caused by the library the plugin uses to access Azure Storage.

Error Message (with identifying info stripped):

[2019-10-09T15:30:53,517][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. AuthenticationFailed (403): Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:b97ce729-701e-0118-11b6-7e28f3000000
Time:2019-10-09T15:30:54.1758627Z: ["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/http/retry_policy.rb:58:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/http/http_request.rb:110:in `block in with_filter'", "org/jruby/RubyMethod.java:132:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/http/signer_filter.rb:28:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/http/http_request.rb:110:in `block in with_filter'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/service.rb:36:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/filtered_service.rb:34:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-core-0.1.15/lib/azure/core/signed_service.rb:41:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-storage-0.15.0.preview/lib/azure/storage/service/storage_service.rb:62:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-storage-0.15.0.preview/lib/azure/storage/blob/blob_service.rb:62:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/azure-storage-0.15.0.preview/lib/azure/storage/blob/container.rb:624:in `list_blobs'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:274:in `block in list_all_blobs'", "org/jruby/RubyKernel.java:1425:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:272:in `list_all_blobs'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:347:in `register_for_read'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:165:in `process'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:314:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:306:in `block in start_input'"] {:exception=>#<Azure::Core::Http::HTTPError:2014 @status_code: 403, @http_response: #<Azure::Core::Http::HttpResponse:0x4e8774b6 @http_response=#<Faraday::Response:0xe562964 @on_complete_callbacks=[], @env=#<Faraday::Env @method=:get @body="\xEF\xBB\xBF<?xml version=\"1.0\" encoding=\"utf-8\"?><Error><Code>AuthenticationFailed</Code><Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:b97ce729-701e-0118-11b6-7e28f3000000\nTime:2019-10-09T15:30:54.1758627Z</Message><AuthenticationErrorDetail>The MAC signature found in the HTTP request '7EZxgw4haDiQQa+Nf+EmLFDpXEguiNvnF/8I1SQThl8=' is not the same as any computed signature. 
Server used following string to sign: 'GET\n\n\n\n\napplication/atom+xml; charset=utf-8\n\n\n\n\n\n\nx-ms-date:Wed, 09 Oct 2019 15:29:21 GMT\nx-ms-version:2016-05-31\n/<storageAccount>/<container>'.</AuthenticationErrorDetail></Error>" @url=#<URI::HTTPS https://<storageAccount>.blob.core.windows.net/<container>> @request=#<Faraday::RequestOptions open_timeout=60> @request_headers={"User-Agent"=>"logstash-input-azureblob/0.9.11; Azure-Storage/0.15.0-preview (Ruby 2.5.3-p0; Linux linux)", "x-ms-date"=>"Wed, 09 Oct 2019 15:29:21 GMT", "x-ms-version"=>"2016-05-31", "DataServiceVersion"=>"1.0;NetFx", "MaxDataServiceVersion"=>"3.0;NetFx", "Content-Type"=>"application/atom+xml; charset=utf-8", "Content-Length"=>"0", "Authorization"=>"SharedKey <storageAccount>:7EZxgw4haDiQQa+Nf+EmLFDpXEguiNvnF/8I1SQThl8="} @ssl=#<Faraday::SSLOptions verify=true> @response=#<Faraday::Response:0xe562964 ...> @response_headers={"content-length"=>"690", "content-type"=>"application/xml", "server"=>"Microsoft-HTTPAPI/2.0", "x-ms-request-id"=>"b97ce729-701e-0118-11b6-7e28f3000000", "date"=>"Wed, 09 Oct 2019 15:30:53 GMT", "connection"=>"close"} @status=403 @reason_phrase="Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.">>, @uri=#<URI::HTTPS https://<storageAccount>.blob.core.windows.net/<container>>>, @uri: #<URI::HTTPS https://<storageAccount>.blob.core.windows.net/<container>>, @description: "Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.\nRequestId:b97ce729-701e-0118-11b6-7e28f3000000\nTime:2019-10-09T15:30:54.1758627Z", @type: "AuthenticationFailed">}

In the output above I've replaced the storage account name with <storageAccount> and the container name with <container>.
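To rule out the credentials themselves, a gem-level check like the one below can confirm whether the account name/key pair works outside Logstash. This is a rough sketch using the current azure-storage-blob gem rather than the 0.15.0.preview build the plugin bundles; the placeholders are the same as in the config further down.

    # list_blobs_check.rb -- minimal sketch, assuming the azure-storage-blob gem is installed
    require "azure/storage/blob"

    blobs = Azure::Storage::Blob::BlobService.create(
      storage_account_name: "<storageAccount>",
      storage_access_key:   "<access Key>"
    )

    # A 403 here points at the account/key; success points at the plugin/JRuby stack instead.
    blobs.list_blobs("<container>").each { |blob| puts blob.name }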

Steps to reproduce:

  1. Build a Docker image from the latest Logstash (7.4.0) and install the Azure input plugins, including logstash-input-azureblob:
    FROM docker.elastic.co/logstash/logstash:7.4.0
    RUN logstash-plugin install logstash-input-azuretopic
    RUN logstash-plugin install logstash-input-azure_event_hubs
    RUN logstash-plugin install logstash-input-azureblob
  2. Use a pipeline like the following:

    
    input {
      azureblob {
        id => "logs-input"
        storage_account_name => "<storageAccount>"
        storage_access_key => "<access Key>"
        container => "<container>"
        registry_create_policy => "start_over"
      }
    }

    filter {
      if [ip] {
        geoip {
          source => "ip"
          remove_field => "ip"
        }
      }
    }

    output {
      elasticsearch {
        id => "logs-output"
        hosts => ["elasticsearch:9200"]
        index => "logs-%{+YYYY.MM.dd}"
        codec => json
      }
    }

ArnauBlanch commented 4 years ago

Hi, I'm having this issue too and I can't figure out how to solve it. 😔

Any news on this?

Thanks!

Mukul-Srivastava commented 4 years ago

Hi, I am facing the same issue, though I am using a CentOS VM rather than the Docker image. The same version of the plugin works perfectly on an Ubuntu VM, but fails when I try to run it on CentOS 7.5. @briangardner, which base OS does the Logstash Docker image use? Can you please check whether it is CentOS?

Mukul-Srivastava commented 4 years ago

Hi folks, it seems I have figured out the issue. It's not an OS dependency; the plugin itself appears to be incompatible with Logstash 7.4.x. I installed Logstash 7.2 and then the logstash-input-azureblob plugin, and it works like a charm.
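For reference, a minimal sketch of that workaround in Docker form (untested; pinning the base image to the 7.2.0 tag is an assumption, adjust to whichever 7.2.x image you need):

    FROM docker.elastic.co/logstash/logstash:7.2.0
    RUN logstash-plugin install logstash-input-azureblob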

virgilp commented 4 years ago

Any idea what changed and why this plugin is no longer working in 7.4? [edit] Actually, it works for me with 7.4.0 (and all versions up to it) but not with 7.4.1 or 7.4.2, so it's more nuanced than a 7.4 break:

The plugin version is the same (logstash-input-azureblob 0.9.13), so something else changed. It's most likely the upgrade of JRuby to 9.2.8.0; but at this point it's probably safe to assume that Microsoft doesn't care about this repo and the plugin is unmaintained.

arunp-motorq commented 4 years ago

I'm having the same error too, using Logstash 7.4.2.

virgilp commented 4 years ago

FYI, I gave up on this plugin and implemented my own; the performance was horrendous (at least for my use case). It may work if you have a small number of blobs that get updated/appended to a lot, but otherwise (if you have many blobs) forget it, you're just wasting time.
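For anyone going the same route, a very rough sketch of the gem-level calls such a replacement can start from (hypothetical, not my actual implementation; it reads every blob once and leaves out registry/offset tracking entirely):

    require "azure/storage/blob"

    blobs = Azure::Storage::Blob::BlobService.create(
      storage_account_name: "<storageAccount>",
      storage_access_key:   "<access Key>"
    )

    # Download each blob in the container and hand its contents to whatever ships the events.
    blobs.list_blobs("<container>").each do |entry|
      _props, content = blobs.get_blob("<container>", entry.name)
      puts content
    end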