Azure / azure-diagnostics-tools

Plugins and tools for collecting, processing, managing, and visualizing diagnostics data and configuration

Logs are arriving hour(s) later #108

Closed branimirackovic closed 7 years ago

branimirackovic commented 7 years ago

Hi,

Thank you for this plugin and the effort. I am using a system with three storage accounts and one ELK stack, and I have a problem with slow reading of data from storage, depending on the traffic: data takes from several minutes to several hours to arrive in Logstash.

Does anyone else experience the same issue? These are systems with about 10k records per hour. Thanks, Acko

02sandeepreddy commented 7 years ago

Hi branimirackovic, glad you got the plugin working. I tried setting up the plugin but couldn't get it to feed the logs from an Azure web app into an on-premises ELK instance. This is what we did; I'd appreciate it if you could let me know the right procedure:

  1. Installed the Logstash Azure plugin.
  2. Created a file inside /logstash/conf.d (azurewebapp_test).
  3. Inside the azurewebapp_test file, entered the Azure storage name, access key, and container name.
  4. Restarted Logstash.

Azure web app on cloud, ELK on AWS cloud. Could you let me know how to install the Azure Logstash plugin and make it work? Thank you.

brahmnes commented 7 years ago

Hi branimirackovic,

Can you clarify whether you are using the azureblob or azurewadtable plugins? 02sandeepreddy, I assume you are using the azureblob plugin. Have you looked at logstash log file under /var/log?

02sandeepreddy commented 7 years ago

Hello brahmnes, thanks for your response. Correct, I'm using the azureblob plugin:
logstash-plugin install logstash-input-azureblob

Logstash version 5.6

Inside conf.d I created a file "azure_test.conf" with the configuration below:

```
input {
  azureblob {
    storage_account_name => "XXX"
    storage_access_key => "XXXX"
    container => "containername"
    codec => "line"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "IP1:9200", "IP2:9200", "IP3:9200" ]
    index => "log-azureblob-%{+YYYY.MM.dd}"
  }
}
```

and ran:

```
/usr/share/logstash/bin/logstash --path.settings /etc/logstash -f azure_test.conf
```
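As an aside on the config above: the `%{+YYYY.MM.dd}` reference in the elasticsearch `index` setting is a date pattern (Logstash uses Joda-time patterns here) expanded per event, so it produces one index per day. A rough Python analogy of the names it yields, not Logstash's actual code:

```python
from datetime import datetime, timezone

def daily_index(prefix: str, when: datetime) -> str:
    """Mimic Logstash's %{+YYYY.MM.dd} expansion: one index name per UTC day."""
    # Logstash expands the pattern from the event's @timestamp in UTC.
    return prefix + when.astimezone(timezone.utc).strftime("%Y.%m.%d")

# An event stamped 2017-10-12 would be routed to:
print(daily_index("log-azureblob-", datetime(2017, 10, 12, 18, 29, tzinfo=timezone.utc)))
# -> log-azureblob-2017.10.12
```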

Here is the log from /var/log/logstash/logstash-plain.log:

```
[2017-10-12T18:29:04,701][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-12T18:29:07,684][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-10-12T18:29:07,686][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. undefined method `type' for #<NoMethodError:0x55e71f1f>:
["/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:309:in `acquire_lease'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:417:in `cleanup_registry'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:148:in `stop'",
 "/usr/share/logstash/logstash-core/lib/logstash/inputs/base.rb:89:in `do_stop'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:515:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:426:in `stop_pipeline'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "org/jruby/RubyHash.java:1342:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:139:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:328:in `execute'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:204:in `run'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'",
 "/usr/share/logstash/lib/bootstrap/environment.rb:71:in `(root)'"]
{:exception=>#<NoMethodError: undefined method `type' for #>}
[2017-10-12T18:29:15,515][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-10-12T18:29:15,517][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-10-12T18:29:15,958][INFO ][logstash.inputs.logstashinputazureblob] Using version 0.9.x input plugin 'azureblob'. This plugin should work but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin.
[2017-10-12T18:29:16,244][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::OutputDelegator:0x39158e7a @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x520d17ef @metric=#<LogStash::Instrument::Metric:0x4156f05a @collector=#<LogStash::Instrument::Collector:0x38bc82e @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x1678f07b @store=#, @structured_lookup_mutex=#, @fast_lookup=#>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\"]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x7c5b70c @metric=#<LogStash::Instrument::Metric:0x4156f05a @collector=#<LogStash::Instrument::Collector:0x38bc82e @agent=nil, @metric_store=#<LogStas
```

xiaomi7732 commented 7 years ago

There are several interesting spots in here. Let's break them down.

  1. The initial issue, reported by @branimirackovic, is that the pipeline feels slow. This is worth troubleshooting. As a matter of fact, 10k records per hour is well below the capability of the plugin. One possibility is that the data doesn't reach Azure blob storage until it has accumulated for an hour - WAD IIS logs use this mechanism. @branimirackovic, could you please double-check whether the data is written to Azure blob storage in real time? We can start troubleshooting from that point.

  2. The second issue, reported by @02sandeepreddy, is that the configuration didn't work. One obvious problem visible in the call stack is that the exception is not well handled - #110, and the fix will be in #111. The more interesting question is why the exception happens at all; there's not enough info in the call stack to tell at this point. @02sandeepreddy, is it possible to apply the fix in #111 in your environment and give it another shot? Paste the logs and let's troubleshoot from there.

Thanks, Saar
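One concrete way to test the hourly-upload theory in point 1 is to compare each record's own timestamp with the wall-clock time it reaches the pipeline; lags clustering near 60 minutes would implicate batched WAD uploads rather than the plugin. A self-contained Python sketch of that lag computation (the timestamp format here is a hypothetical example, not a format the plugin prescribes):

```python
from datetime import datetime, timezone

def arrival_lag_minutes(record_timestamp: str, received_at: datetime) -> float:
    """Minutes between when a record was produced (its embedded timestamp,
    assumed to be UTC) and when it showed up in the pipeline."""
    produced = datetime.strptime(record_timestamp, "%Y-%m-%d %H:%M:%S").replace(tzinfo=timezone.utc)
    return (received_at - produced).total_seconds() / 60.0

# A record stamped 18:00 that only arrives at 19:02 lagged 62 minutes:
received = datetime(2017, 10, 12, 19, 2, 0, tzinfo=timezone.utc)
print(arrival_lag_minutes("2017-10-12 18:00:00", received))
# -> 62.0
```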

02sandeepreddy commented 7 years ago

@xiaomi7732 I get the error in the logs. Below is the procedure I followed: changed the code to https://github.com/Azure/azure-diagnostics-tools/pull/111, built the gem file, installed the Logstash plugin from that gem file, and started Logstash.

Log output (tailf /var/log/logstash/logstash-plain.log):

```
[2017-10-17T17:12:28,335][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-17T17:13:01,172][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-10-17T17:13:01,175][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-10-17T17:13:04,168][INFO ][logstash.inputs.logstashinputazureblob] Using version 0.9.x input plugin 'azureblob'. This plugin should work but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin.
[2017-10-17T17:13:05,860][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::OutputDelegator:0x7f26b5c3 @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0x2c3be7c8 @metric=#<LogStash::Instrument::Metric:0x71bf56d @collector=#<LogStash::Instrument::Collector:0x7dc30688 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x37012c5e @store=#, @structured_lookup_mutex=#, @fast_lookup=#>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\"]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x798d1ff2 @metric=#<LogStash::Instrument::Metric:0x71bf56d @collector=#<LogStash::Instrument::Collector:0x7dc30688 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x37012c5e @store=#, @structured_lookup_mutex=#, @fast_lookup=#>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @logger=#<LogStash::Logging::Logger:0x3f19166a @logger=#>, @out_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\", :events] key: out value: 0, @in_counter=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\", :events] key: in value: 0, @strategy=#<LogStash::OutputDelegatorStrategies::Shared:0xd01c4c4 @output=<LogStash::Outputs::ElasticSearch hosts=>[//ELASTICSEARCH01:9200, //ELASTICSEARCH02:9200, //ELASTICSEARCH03:9200], index=>\"log-azureblob-%{+YYYY.MM.dd}\", id=>\"01f667decd73cdeadd241c73007856fa08f847f9-3\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_b3922808-4dd6-484b-9006-235bab3f7256\", enable_metric=>true, charset=>\"UTF-8\">, workers=>1, manage_template=>true, template_name=>\"logstash\", template_overwrite=>false, idle_flush_time=>1, doc_as_upsert=>false, script_type=>\"inline\", script_lang=>\"painless\", script_var_name=>\"event\", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>\"index\", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>>, @id=\"01f667decd73cdeadd241c73007856fa08f847f9-3\", @time_metric=LogStash::Instrument::MetricType::Counter - namespaces: [:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\", :events] key: duration_in_millis value: 0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x7adc2948 @metric=#<LogStash::Instrument::Metric:0x71bf56d @collector=#<LogStash::Instrument::Collector:0x7dc30688 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x37012c5e @store=#, @structured_lookup_mutex=#, @fast_lookup=#>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"01f667decd73cdeadd241c73007856fa08f847f9-3\", :events]>, @output_class=LogStash::Outputs::ElasticSearch>", :error=>"HttpClientBuilder not found in packages org.apache.http.client.methods, org.apache.http.client.entity, org.apache.http.client.config, org.apache.http.config, org.apache.http.conn.socket, org.apache.http.impl, org.apache.http.impl.client, org.apache.http.impl.conn, org.apache.http.impl.auth, org.apache.http.entity, org.apache.http.message, org.apache.http.params, org.apache.http.protocol, org.apache.http.auth, java.util.concurrent, org.apache.http.client.protocol, org.apache.http.conn.ssl, java.security.cert, java.security.spec, java.security, org.apache.http.client.utils; last error: cannot load Java class org.apache.http.client.utils.HttpClientBuilder"}
[2017-10-17T17:13:05,969][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#<NameError: HttpClientBuilder not found in packages org.apache.http.client.methods, org.apache.http.client.entity, org.apache.http.client.config, org.apache.http.config, org.apache.http.conn.socket, org.apache.http.impl, org.apache.http.impl.client, org.apache.http.impl.conn, org.apache.http.impl.auth, org.apache.http.entity, org.apache.http.message, org.apache.http.params, org.apache.http.protocol, org.apache.http.auth, java.util.concurrent, org.apache.http.client.protocol, org.apache.http.conn.ssl, java.security.cert, java.security.spec, java.security, org.apache.http.client.utils; last error: cannot load Java class org.apache.http.client.utils.HttpClientBuilder>, :backtrace=>[
 "file:/usr/share/logstash/vendor/jruby/lib/jruby.jar!/jruby/java/core_ext/module.rb:45:in `const_missing'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.1-java/lib/manticore/client.rb:382:in `client_builder'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/manticore-0.6.1-java/lib/manticore/client.rb:180:in `initialize'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:26:in `initialize'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:272:in `build_adapter'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:276:in `build_pool'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:60:in `initialize'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:101:in `create_http_client'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:97:in `build'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch.rb:230:in `build_client'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-7.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:24:in `register'",
 "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'",
 "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `register'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:310:in `start_workers'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
[2017-10-17T17:13:06,497][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-17T17:13:09,185][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-10-17T17:13:09,188][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. undefined method `break_blob_lease' for nil:NilClass:
["/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:320:in `acquire_lease'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:419:in `cleanup_registry'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:148:in `stop'",
 "/usr/share/logstash/logstash-core/lib/logstash/inputs/base.rb:89:in `do_stop'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:515:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:426:in `stop_pipeline'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "org/jruby/RubyHash.java:1342:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:139:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:328:in `execute'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:204:in `run'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'",
 "/usr/share/logstash/lib/bootstrap/environment.rb:71:in `(root)'"]
{:exception=>#<NoMethodError: undefined method `break_blob_lease' for nil:NilClass>}
```

xiaomi7732 commented 7 years ago

@02sandeepreddy Thanks for sending the logs. Now the problem seems to be in the Elasticsearch output. To confirm, could you please comment out the Elasticsearch output in your config file to see if the standard output works? I believe it will. If not, please paste the new logs again.

02sandeepreddy commented 7 years ago

Hello @xiaomi7732 I did as you suggested (commented out the ES output) and here are the logs.

Config file:

```
input {
  azureblob {
    storage_account_name => "XXX"
    storage_access_key => "XXXX"
    container => "containername"
    codec => "line"
  }
}
output {
  stdout { codec => rubydebug }
}
```

```
[root@ip-10-211-43-117 bin]# tailf /var/log/logstash/logstash-plain.log
[2017-10-17T22:15:00,007][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"<LogStash::Inputs::LogstashInputAzureblob storage_account_name=>\"XXXXX\", storage_access_key=>\"XXXX\", container=>\"XX\", codec=><LogStash::Codecs::Line id=>\"line_f8a178e7-fdef-4a33-8f0c-87c8ee926c92\", enable_metric=>true, charset=>\"UTF-8\", delimiter=>\"\n\">, id=>\"afde19387b533f3560bf8e87b6c1cbcb18fd7965-1\", enable_metric=>true, endpoint=>\"core.windows.net\", registry_path=>\"data/registry\", registry_lease_duration=>15, interval=>30, registry_create_policy=>\"resume\", file_head_bytes=>0, file_tail_bytes=>0, blob_list_page_size=>100, file_chunk_size_bytes=>4194304>", :error=>"bind: name or service not known"}
[2017-10-17T22:15:00,157][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>#, :backtrace=>[
 "org/jruby/ext/socket/RubyUDPSocket.java:164:in `bind'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:638:in `bind_random_port'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:723:in `initialize'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:707:in `initialize'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:545:in `make_udp_requester'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:500:in `each_resource'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/resolv.rb:480:in `getresource'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/remote_fetcher.rb:92:in `api_endpoint'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/source.rb:46:in `api_uri'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/source.rb:182:in `load_specs'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/spec_fetcher.rb:261:in `tuples_for'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/spec_fetcher.rb:226:in `available_specs'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/source_list.rb:97:in `each_source'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/spec_fetcher.rb:222:in `available_specs'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/spec_fetcher.rb:102:in `search_for_dependency'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems/spec_fetcher.rb:166:in `spec_for_dependency'",
 "/usr/share/logstash/vendor/jruby/lib/ruby/shared/rubygems.rb:809:in `latest_spec_for'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:124:in `register'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `register_plugin'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:301:in `register_plugins'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:458:in `start_inputs'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:348:in `start_workers'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:235:in `run'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
[2017-10-17T22:15:00,288][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-10-17T22:15:03,196][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-10-17T22:15:03,218][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. undefined method `break_blob_lease' for nil:NilClass:
["/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:320:in `acquire_lease'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:419:in `cleanup_registry'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:148:in `stop'",
 "/usr/share/logstash/logstash-core/lib/logstash/inputs/base.rb:89:in `do_stop'",
 "org/jruby/RubyArray.java:1613:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:515:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:426:in `stop_pipeline'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "org/jruby/RubyHash.java:1342:in `each'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:442:in `shutdown_pipelines'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:139:in `shutdown'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:328:in `execute'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:204:in `run'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'",
 "/usr/share/logstash/lib/bootstrap/environment.rb:71:in `(root)'"]
{:exception=>#<NoMethodError: undefined method `break_blob_lease' for nil:NilClass>}
[2017-10-17T22:15:11,168][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-10-17T22:15:11,171][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-10-17T22:15:11,660][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"input", :name=>"azureblob", :path=>"logstash/inputs/azureblob", :error_message=>"uninitialized constant JSON::Ext::Parser", :error_class=>NameError, :error_backtrace=>[
 "org/jruby/RubyModule.java:2746:in `const_missing'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json/ext.rb:16:in `Ext'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json/ext.rb:12:in `JSON'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json/ext.rb:9:in `(root)'",
 "org/jruby/RubyKernel.java:1040:in `require'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json.rb:1:in `(root)'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json.rb:58:in `JSON'",
 "org/jruby/RubyKernel.java:1040:in `require'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/json-1.8.6-java/lib/json.rb:54:in `(root)'",
 "org/jruby/RubyKernel.java:1040:in `require'",
 "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/polyglot-0.3.5/lib/polyglot.rb:65:in `require'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:1:in `(root)'",
 "/usr/share/logstash/vendor/local_gems/adb1850b/logstash-input-azureblob-0.9.12-java/lib/logstash/inputs/azureblob.rb:7:in `(root)'",
 "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:1:in `(root)'",
 "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:156:in `legacy_lookup'",
 "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:138:in `lookup'",
 "/usr/share/logstash/logstash-core/lib/logstash/plugins/registry.rb:180:in `lookup_pipeline_plugin'",
 "org/jruby/RubyKernel.java:1079:in `eval'",
 "/usr/share/logstash/logstash-core/lib/logstash/plugin.rb:140:in `lookup'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:103:in `plugin'",
 "(eval):8:in `initialize'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:75:in `initialize'",
 "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:165:in `initialize'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:286:in `create_pipeline'",
 "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:95:in `register_pipeline'",
 "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:313:in `execute'",
 "/usr/share/logstash/lib/bootstrap/environment.rb:71:in `(root)'"]}
[2017-10-17T22:15:11,682][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Couldn't find any input plugin named 'azureblob'. Are you sure this is correct? Trying to load the azureblob input plugin resulted in this error: Problems loading the requested plugin named azureblob of type input. Error: NameError uninitialized constant JSON::Ext::Parser"}
[2017-10-17T22:15:19,698][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-10-17T22:15:19,700][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-10-17T22:15:20,226][ERROR][logstash.plugins.registry] Problems loading a plugin with {:type=>"input", :name=>"azureblob", :path=>"logstash/inputs/azureblob", :error_message=>"uninitialized constant JSON::Ext::Generator", :error_class=>NameError, :error_backtrace=>["org/jruby/RubyModule.java:2746:in `const_missing'",
```

xiaomi7732 commented 7 years ago

@02sandeepreddy, thanks for the logs. Reading them, it seems that the UDP port is taken and Logstash / JRuby is not handling that case correctly by choosing another random port. This is beyond the control of the azureblob input plugin.
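For what it's worth, the literal error string comes out of `getaddrinfo` during the UDP socket bind in JRuby's resolv library (note the `bind_random_port` frame in the backtrace above). The same class of failure can be shown outside JRuby with a plain Python sketch; this is only an analogy of the error, not the actual Logstash code path:

```python
import socket

# resolv binds a UDP socket to pick a source port for DNS queries. If the
# local address being bound cannot be resolved, bind() fails with
# getaddrinfo's "name or service not known" family of errors
# (socket.gaierror in Python), matching the string in the Logstash log.
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    s.bind(("no-such-host.invalid", 0))  # deliberately unresolvable name
except socket.gaierror as e:
    print("bind failed:", e)
finally:
    s.close()
```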

It looks like this issue: https://github.com/elastic/logstash/issues/1587; however, that should have been fixed by now.

At this point, I would suggest you open a ticket here: https://discuss.elastic.co/c/logstash to get some help from Elastic.

02sandeepreddy commented 7 years ago

@xiaomi7732
Thanks for your time. I'm going to do that now.

xiaomi7732 commented 7 years ago

Closing this one since there's no action item on the plug-in end. Feel free to re-open it if there's any follow-up on this issue.