Closed — codyja closed this issue 7 years ago
The Nordes fork is the way to go with blobs that are appended to. The initial version of the plugin in this repo will skip any file that has an associated lock file. Do you still get the errors when you try the fork?
Hi, I seem to be having an issue using the Nordes fork. After I install it, the logstash log starts reporting errors about failing to create the pipeline:
[ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Couldn't find any input plugin named 'azureblob'. Are you sure this is correct? Trying to load the azureblob input plugin resulted in this error: Problems loading the requested plugin named azureblob of type input. Error: NameError NameError"}
@codyja, @anuraza, How is it going? Would you mind trying out logstash-input-azureblob 0.9.8? That should address your requirement. (https://rubygems.org/gems/logstash-input-azureblob)
@xiaomi7732 the changes break reading of NSG Flow Logs. Incremental reads on a blob end up passing malformed JSON to logstash, causing parsing issues and preventing log ingestion. Could you please reopen the issue?
Hi @xiaomi7732, I seem to get the same error @mattreatMSFT encountered. Let me know if I can help test anything; I have my environment set up and am collecting NSG flow logs currently. Here's a piece of the logstash log:
[2017-08-23T20:33:53,906][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[records] is of type = NilClass
[2017-08-23T20:33:53,908][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[records][properties][flows] is of type = NilClass
[2017-08-23T20:33:53,908][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[records][properties][flows][flows] is of type = NilClass
[2017-08-23T20:33:53,909][WARN ][logstash.filters.split ] Only String and Array types are splittable. field:[records][properties][flows][flows][flowTuples] is of type = NilClass
@codyja, Thanks for reaching out. The fix is on the way. Check out #83. If you are interested in the early bits, you can build the gem off the tip of that branch.
Please refer to the examples in this README to set up the filter. Search for 'NSG Logs' on the page:
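For anyone landing here later: the split warnings above name the exact fields involved, so the filter section of the config presumably needs to parse the blob as JSON and then split each nested array in turn. A minimal sketch (field names taken from the warnings in this thread; everything else is illustrative, not the README's exact text):

```
filter {
  # NSG flow logs arrive as a JSON document of the form {"records": [...]}
  json {
    source => "message"
  }
  # Split the nested arrays down to individual flow tuples
  split { field => "[records]" }
  split { field => "[records][properties][flows]" }
  split { field => "[records][properties][flows][flows]" }
  split { field => "[records][properties][flows][flows][flowTuples]" }
}
```

If the json parse fails (for example, because the input handed over a malformed fragment of an appended blob), `[records]` never exists, which is exactly why the split filter then warns that the field is of type NilClass.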
Hi @xiaomi7732, I believe I built a gem off your json-head-tail branch (version 0.9.9), and updated my logstash.conf, but still seem to be getting those same error messages.
root@6f866f0d4bf4:/opt/logstash# gosu logstash bin/logstash-plugin list --verbose logstash-input-azureblob-json-head-tail
logstash-input-azureblob-json-head-tail (0.9.9)
@codyja, Thank you for the reply.
@mattreatMSFT, @codyja, The fix is released: https://rubygems.org/gems/logstash-input-azureblob/versions/0.9.9 Please refer to the examples in README.md for the proper configuration.
@xiaomi7732 I've been testing today, and it looks to be working like a champ! NSG flow logs seem to be updating in my dashboard constantly, no errors at all. Thanks very much for this fix!
@codyja, Thank you so much for the quick turnaround to confirm it works. I am glad to hear that. Please feel free to open issues here if you see any.
@codyja just checking: are you still using this for NSG Flow logs? If so, did you have to update the logstash filter to take into account any changes in the NSG flow log format?
Any thoughts would be great
Thanks
Hass
Hi everyone, how can I ship Azure Firewall logs to ELK? I have only found examples for Azure App Gateway logs.
Hello,
Is there any update for this issue? I am using logstash-input-azureblob 0.9.13 to collect NSG flow logs following this guide: https://github.com/Azure/azure-diagnostics-tools/tree/master/Logstash/logstash-input-azureblob
But I am hitting the same issue: logstash cannot parse the NSG flow records and logs the following errors:
[WARN ] 2020-07-22 06:35:39.664 [[main]>worker1] split - Only String and Array types are splittable. field:[records] is of type = NilClass
[WARN ] 2020-07-22 06:35:39.664 [[main]>worker1] split - Only String and Array types are splittable. field:[records][properties][flows] is of type = NilClass
[WARN ] 2020-07-22 06:35:39.664 [[main]>worker1] split - Only String and Array types are splittable. field:[records][properties][flows][flows] is of type = NilClass
[WARN ] 2020-07-22 06:35:39.664 [[main]>worker1] split - Only String and Array types are splittable. field:[records][properties][flows][flows][flowTuples] is of type = NilClass
Hi, I'm seeing what I believe is an issue with the logstash-input-azureblob plugin regarding Azure's NSG flow logs (created by Network Watcher). The flow logs are written to a log file continuously for one hour, then a new hour directory is created, and the process continues. What I think is happening is that the logstash-input-azureblob plugin discovers the new log file that is created every hour (within a minute or so of its creation), sends it to, say, Elasticsearch, then drops a .LOCK file in that directory so it knows it has already read the file. The problem is that Azure continues to write to the log file for the next hour (I believe every minute), so the plugin isn't getting all the data. This means Elasticsearch is only getting a piece of the logs.
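For context, here is roughly what the input side of my pipeline looks like. This is a sketch, not my exact config: the storage account name and key are placeholders, though the container name shown is the standard one Network Watcher uses for NSG flow logs:

```
input {
  azureblob {
    storage_account_name => "mystorageaccount"   # placeholder
    storage_access_key   => "<access key>"       # placeholder
    # Standard container Network Watcher writes NSG flow logs into
    container            => "insights-logs-networksecuritygroupflowevent"
  }
}
```

With this setup the plugin sees each hourly blob once, so everything Azure appends to that blob after the first read is missed, which is the behavior described above.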
I tested the Nordes fork, which appears to write its state to an Azure Table, but it produced additional syntax errors when interpreting the actual flow logs.