Open: JulianCeaser opened this issue 4 years ago
Hi Julian, could you share your architecture and configuration for this issue? At first glance it looks like the plugin cannot read the blob in Azure Storage, so there is no content to fetch and no content length to read. Please check your configuration first, or share it here.
I'm running into this problem as well: data flows from the input for a while, then this exception is hit, eventually leading to JVM heap exhaustion and a restart of Logstash. Note that I am using the plain codec in the input plugin and cleaning up the string in the filter before converting it to JSON, because Azure NSG flow logs add some garbage to the JSON file that was causing the json codec to fail. Anyway, here is the exception I am hitting. I believe you just need to check whether the object is nil before calling .length:
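The "clean up the string before converting it to JSON" step described above can be sketched in plain Ruby. This is a rough illustration, not the filter actually used; `clean_and_parse` and the assumption about where the garbage sits are mine:

```ruby
require 'json'

# Rough sketch: keep only the text between the first '{' and the last '}',
# then parse. The exact garbage bytes in a given NSG blob are an assumption.
def clean_and_parse(raw)
  start  = raw.index('{')
  finish = raw.rindex('}')
  return nil if start.nil? || finish.nil? || finish < start
  JSON.parse(raw[start..finish])
end
```

In a real pipeline the equivalent cleanup would live in a `mutate`/`gsub` filter before the `json` filter.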
[2020-05-17T23:06:21,558][ERROR][logstash.inputs.logstashinputazureblob][main][345b32f7914980f8f6b9f62d4c667c525c3a47ff204619d5c238b29665106bfa] Oh My, An error occurred. Error: undefined method `length' for nil:NilClass
Trace: ["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:346:in `inputworker'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:337:in `block in start_input'"]
{:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}
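The nil guard suggested above could look like this in plain Ruby. It is only a sketch of the idea; `safe_length` is a hypothetical helper, not part of the plugin's actual code:

```ruby
# Hypothetical guard: treat a nil blob body as empty before any .length call,
# instead of letting NoMethodError propagate out of the input worker.
def safe_length(content)
  (content || "").length
end
```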
Hi,
I have the same issue: once all the flow logs have been ingested, I get the following error:
[2021-11-17T19:32:16,534][ERROR][logstash.inputs.logstashinputazureblob][main][46dda2069656930a708a460620ac57fc3815ab0d4f59c2bffbc7f8c7f0e3b96a] Oh My, An error occurred. Error: undefined method `length' for nil:NilClass
Trace: ["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'",
"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:405:in `inputworker'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:396:in `block in start_input'"]
{:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}
Is there any fix for this?
Thanks! PA
Did anyone manage to solve this? I get the same issue when parsing NSG flow logs, and I cannot see any difference in the JSON file (PT1H.json) within the storage container.
[DEBUG] 2022-02-27 15:42:45.791 [logstash-pipeline-flush] PeriodicFlush - Pushing flush onto pipeline.
[DEBUG] 2022-02-27 15:42:46.577 [[main]<azureblob] logstashinputazureblob - Processing blob resourceId=/SUBSCRIPTIONS/
Error: undefined method `length' for nil:NilClass
Trace: ["/usr/share/logstash/vendor/local_gems/7a9244fa/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:212:in `process'",
"/usr/share/logstash/vendor/local_gems/7a9244fa/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:153:in `run'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:409:in `inputworker'",
"/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:400:in `block in start_input'"]
{:exception=>#<NoMethodError: undefined method `length' for nil:NilClass>}
[DEBUG] 2022-02-27 15:42:48.944 [[main]<azureblob] logstashinputazureblob - Hitting interval of 30s . . .
[DEBUG] 2022-02-27 15:42:49.308 [pool-6-thread-1] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2022-02-27 15:42:49.308 [pool-6-thread-1] jvm - collector name {:name=>"ConcurrentMarkSweep"}
Below is my Logstash config:

input {
  azureblob {
    storage_account_name => "
    # Typical numbers could be 21/9 or 12/2, depending on the NSG log file type
    file_head_bytes => 12
    file_tail_bytes => 2
    # Enable / tweak these settings when an event is too big for the codec to handle.
    # break_json_down_policy => "with_head_tail"
    # break_json_batch_count => 2
  }
}
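For context, `file_head_bytes` / `file_tail_bytes` conceptually trim a fixed number of bytes from each end of the raw blob so that the remaining middle is parseable JSON. A minimal Ruby sketch of that trimming; `strip_head_tail` is illustrative, not the plugin's implementation:

```ruby
# Illustrative only: drop head_bytes from the front and tail_bytes from the
# back of the raw blob, returning "" when the blob is too small to trim.
def strip_head_tail(raw, head_bytes, tail_bytes)
  return "" if raw.nil? || raw.bytesize <= head_bytes + tail_bytes
  raw.byteslice(head_bytes, raw.bytesize - head_bytes - tail_bytes)
end
```

Note the early return for nil or undersized input: a blob shorter than head + tail would otherwise produce a nil slice, which is exactly the kind of value that later blows up on `.length`.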
In the last few days I have been getting this error when using the logstash-input-azureblob plugin. I have verified the credentials for the storage account, so that is fine.
[2020-03-27T12:36:51,429][ERROR][logstash.inputs.logstashinputazureblob] Oh My, An error occurred. Error: undefined method `length?' for #<String:0x1bc31a92>
Trace: ["/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:210:in `process'",
"/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/vendor/bundle/jruby/2.5.0/gems/logstash-input-azureblob-0.9.13-java/lib/logstash/inputs/azureblob.rb:151:in `run'",
"/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/logstash-core/lib/logstash/java_pipeline.rb:309:in `inputworker'",
"/usr/local/xxxxxx/logstash-7.2.0-xxxxxx/logstash-core/lib/logstash/java_pipeline.rb:302:in `block in start_input'"]
{:exception=>#<NoMethodError: undefined method `length?' for #<String:0x1bc31a92>>}
Can someone please help me understand what the issue is?
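One observation on this variant of the error: `length?` is not a method on Ruby's String class, so a local patch that rewrote `.length` as `.length?` would fail exactly like this as soon as the content is a real String rather than nil. A guard that handles both cases; `content_length` is a hypothetical helper, not plugin code:

```ruby
# Hypothetical guard that works whether content is nil or a String:
# respond_to?(:length) is false for nil and true for String.
def content_length(content)
  content.respond_to?(:length) ? content.length : 0
end
```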