Azure / azure-diagnostics-tools

Plugins and tools for collecting, processing, managing, and visualizing diagnostics data and configuration

NoMethodError: undefined method `length? #217

Open JulianCeaser opened 4 years ago

JulianCeaser commented 4 years ago

In the last few days I have been getting this error when using the logstash-input-azureblob plugin. I have verified the credentials for the storage account, so that is fine.

[2020-03-27T12:36:51,429][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. Error:undefined method `length?' for #<String:0x1bc31a92>: Trace: ["/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'", "/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'", "/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'", "/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'"] {:exception=>#<NoMethodError: undefined method `length?' for #<String:0x1bc31a92>>}

Can someone please help me understand what the issue is?

pinochioze commented 4 years ago

Hi Julian, could you share your architecture along with the issue? At first glance, it looks like the plugin cannot read the blob in Azure Storage, so there is no content to fetch and it cannot get the length of the content. You should take a look at your configuration first, or share it here.

chrismon commented 4 years ago

I'm running into this problem as well. Data flows from the input for a while and then this exception is hit, eventually leading to JVM heap exhaustion and a restart of Logstash. Note that I am using the plain codec in the input plugin and cleaning up the string in a filter before converting it to JSON, because Azure NSG flow logs add some garbage to the JSON file that was causing the json codec to fail. Anyway, here is the exception I am hitting. I believe you just need to check whether the object is nil before calling .length (see the sketch after the log below).

[2020-05-17T23:06:21,558][ERROR][logstash.inputs.logstashinputazureblob][main][345b32f7914980f8f6b9f62d4c667c525c3a47ff204619d5c238b29665106bfa] Oh My, An error occurred. Error:undefined method `length' for nil:NilClass: Trace: ["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:346:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:337:in `block in start_input'"] {:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}
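
A minimal sketch of that nil guard, for illustration only: the method and variable names below are hypothetical and this is not the plugin's actual code (the real failure is in the process method of azureblob.rb, around line 210 of logstash-input-azureblob 0.9.13).

    require 'logger'

    LOGGER = Logger.new($stdout)

    # Hypothetical sketch: guard against a nil or empty blob body before calling
    # .length, instead of letting a NoMethodError escape the input worker.
    def process_blob_content(content)
      if content.nil? || content.empty?
        LOGGER.warn("Blob returned no content; skipping it")
        return
      end

      LOGGER.debug("Blob content length: #{content.length}")
      # ... decode the content and emit events as usual ...
    end

    process_blob_content(nil)                 # logs a warning instead of raising
    process_blob_content('{"records":[]}')    # logs the length and continues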

pacrutchet commented 2 years ago

Hi,

I have the same issue. Once all the flow logs have been ingested, I get the following error:

[2021-11-17T19:32:16,534][ERROR][logstash.inputs.logstashinputazureblob][main][46dda2069656930a708a460620ac57fc3815ab0d4f59c2bffbc7f8c7f0e3b96a] Oh My, An error occurred. Error:undefined method `length' for nil:NilClass: Trace: ["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:396:in `block in start_input'"] {:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}

Is there any fix for this?

Thanks! PA

jasonrepos commented 2 years ago

Did anyone manage to solve this? I get the same issue when parsing NSG flow logs, and I cannot see any difference in the JSON file (PT1H.json) within the storage container.

[DEBUG] 2022-02-27 15:42:45.791 [logstash-pipeline-flush] PeriodicFlush - Pushing flush onto pipeline.
[DEBUG] 2022-02-27 15:42:46.577 [[main]<azureblob] logstashinputazureblob - Processing blob resourceId=/SUBSCRIPTIONS//RESOURCEGROUPS//PROVIDERS/MICROSOFT.NETWORK/NETWORKSECURITYGROUPS//y=2022/m=02/d=27/h=03/m=00/macAddress=/PT1H.json
[DEBUG] 2022-02-27 15:42:47.002 [[main]<azureblob] logstashinputazureblob - start index: 105130 blob size: 105132
[DEBUG] 2022-02-27 15:42:47.022 [[main]<azureblob] logstashinputazureblob - New registry offset: 105130
[ERROR] 2022-02-27 15:42:48.942 [[main]<azureblob] logstashinputazureblob - Oh My, An error occurred. Error:undefined method `length' for nil:NilClass: Trace: ["/usr/share/logstash/vendor/local_gems/7a9244fa/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:212:in `process'", "/usr/share/logstash/vendor/local_gems/7a9244fa/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:153:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:409:in `inputworker'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:400:in `block in start_input'"] {:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}
[DEBUG] 2022-02-27 15:42:48.944 [[main]<azureblob] logstashinputazureblob - Hitting interval of 30s . . .
[DEBUG] 2022-02-27 15:42:49.308 [pool-6-thread-1] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2022-02-27 15:42:49.308 [pool-6-thread-1] jvm - collector name {:name=>"ConcurrentMarkSweep"}

Below is my Logstash config:

    input {
      azureblob {
        storage_account_name => ""
        storage_access_key => ""
        container => "insights-logs-networksecuritygroupflowevent"
        codec => "json"

        # Refer https://docs.microsoft.com/azure/network-watcher/network-watcher-read-nsg-flow-logs
        # Typical numbers could be 21/9 or 12/2 depending on the NSG log file types
        file_head_bytes => 12
        file_tail_bytes => 2
        # Enable / tweak these settings when the event is too big for the codec to handle.
        # break_json_down_policy => "with_head_tail"
        # break_json_batch_count => 2
      }
    }
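
For context on the file_head_bytes / file_tail_bytes values in that config, here is a rough Ruby illustration (not the plugin's code, and the blob content is made up) of what the 12-byte header and 2-byte footer correspond to in an NSG flow log blob: the non-repeating {"records":[ prefix and ]} suffix that wrap the repeated record objects.

    # Illustration only: how a 12-byte head and 2-byte tail frame an NSG flow log blob.
    blob = '{"records":[{"time":"2022-02-27T03:00:00Z"},{"time":"2022-02-27T03:01:00Z"}]}'

    head_bytes = 12   # length of '{"records":['
    tail_bytes = 2    # length of ']}'

    head = blob[0, head_bytes]                            # => '{"records":['
    tail = blob[blob.length - tail_bytes, tail_bytes]     # => ']}'
    body = blob[head_bytes...(blob.length - tail_bytes)]  # the comma-separated record objects

    puts head
    puts body
    puts tail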