huqianghui closed this issue 1 year ago.
Good question. I am running version 0.12.2, and `logstash-plugin install` does not seem to update to the latest version of azure_blob_storage. I also need a way to install the latest version of the plugin.
I found out how to update the plugin:

```
bin/logstash-plugin update logstash-input-azure_blob_storage
```

Using it, I got the plugin updated to 0.12.9.
Thanks Julian for chipping in. I appreciate it.
0.9.6 is the version of the plugin named azureblob. I learned a lot from that plugin, but I struggled with its blob lease locking, so I rewrote it as logstash-input-azure_blob_storage. The configuration is different.
Logstash plugins are documented here https://www.elastic.co/guide/en/logstash/current/working-with-plugins.html#listing-plugins
For Ubuntu I use these commands to list, update, remove and install:

```
sudo -u logstash /usr/share/logstash/bin/logstash-plugin list --verbose
sudo -u logstash /usr/share/logstash/bin/logstash-plugin update
sudo -u logstash /usr/share/logstash/bin/logstash-plugin update logstash-input-azure_blob_storage
sudo -u logstash /usr/share/logstash/bin/logstash-plugin remove logstash-input-azurestorage
sudo -u logstash /usr/share/logstash/bin/logstash-plugin install logstash-input-azure_blob_storage
```
The install should download the plugin gem from rubygems.org, because I push every new release there: https://rubygems.org/gems/logstash-input-azure_blob_storage
But I myself use the build.sh script to build and install the gem locally. You don't have to do this, unless you want to modify the code in lib/logstash/inputs/azure_blob_storage.rb:

```
sudo -u logstash gem build logstash-input-azure_blob_storage.gemspec
sudo -u logstash gem install logstash-input-azure_blob_storage-${VERSION}.gem
sudo -u logstash /usr/share/logstash/bin/logstash-plugin install ${GEMPWD}/logstash-input-azure_blob_storage-${VERSION}.gem
```
Thanks both @JulianCeaser & @janmg.
Could you share your Logstash config sample for this plugin, along with your Logstash and Ubuntu versions?
I followed your guide above, but I could only update to a 0.11.x version; it cannot upgrade to 0.12.7.
My Logstash version is 6.8.6. This version can install and update the plugin successfully.
Other Logstash versions, like 7.x or 8.x, could not install or update the plugin successfully in my environment.
The error message is as below:

```
warning: thread "[main]>worker4" terminated with exception (report_on_exception is true):
NoMethodError: undefined method `shutdown_requested?' for #<LogStash::Pipeline:0x2f02daaf>
  pipeline_shutdown_requested? at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:380
  wait_for_successful_connection at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/outputs/elasticsearch.rb:426
  multi_receive at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/outputs/elasticsearch.rb:376
  multi_receive at org/logstash/config/ir/compiler/OutputStrategyExt.java:118
  multi_receive at org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101
  output_batch at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:390
  each at org/jruby/RubyHash.java:1419
  output_batch at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:389
  worker_loop at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:341
  start_workers at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:304
[2023-07-31T16:50:07,806][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `shutdown_requested?' for #<LogStash::Pipeline:0x2f02daaf>>, :backtrace=>["/eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:380:in `pipeline_shutdown_requested?'", "/eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/outputs/elasticsearch.rb:426:in `wait_for_successful_connection'", "/eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.15.9-java/lib/logstash/outputs/elasticsearch.rb:376:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in `multi_receive'", "/eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:390:in `block in output_batch'", "org/jruby/RubyHash.java:1419:in `each'", "/eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:389:in `output_batch'", "/eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:341:in `worker_loop'", "/eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:304:in `block in start_workers'"]}
[2023-07-31T16:50:07,823][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
```
```
warning: thread "[main]>worker1" terminated with exception (report_on_exception is true):
NoMethodError: undefined method `pop' for nil:NilClass
  awesome at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/amazing_print-1.5.0/lib/amazing_print/inspector.rb:93
  ai at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/amazing_print-1.5.0/lib/amazing_print/core_ext/kernel.rb:11
  encode_default at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-codec-rubydebug-3.1.0/lib/logstash/codecs/rubydebug.rb:38
  call at org/jruby/RubyMethod.java:120
  encode at /eastmoney/logstash-6.8.6/vendor/bundle/jruby/2.5.0/gems/logstash-codec-rubydebug-3.1.0/lib/logstash/codecs/rubydebug.rb:34
  multi_encode at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/codecs/base.rb:48
  each at org/jruby/RubyArray.java:1792
  multi_encode at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/codecs/base.rb:48
  multi_receive at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/outputs/base.rb:87
  multi_receive at org/logstash/config/ir/compiler/OutputStrategyExt.java:118
  multi_receive at org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101
  output_batch at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:390
  each at org/jruby/RubyHash.java:1419
  output_batch at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:389
  at :1
  start_workers at /eastmoney/logstash-6.8.6/logstash-core/lib/logstash/pipeline.rb:304
Exception in thread "[main]>worker1" java.lang.NullPointerException
  at org.jruby.internal.runtime.ThreadService.getMainThread(ThreadService.java:231)
  at org.jruby.RubyThread.exceptionRaised(RubyThread.java:1792)
  at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:112)
  at java.base/java.lang.Thread.run(Thread.java:829)
```
My config file is as below:
```
input {
  azure_blob_storage {
    connection_string => "DefaultEndpointsProtocol=https;AccountName=netflowaci4pocdiag;AccountKey=XXXXXXXXXXXXXXXX;EndpointSuffix=core.chinacloudapi.cn"
    container => "insights-logs-networksecuritygroupflowevent"
    codec => "json"
    # Refer to https://learn.microsoft.com/azure/network-watcher/network-watcher-read-nsg-flow-logs
    # Typical numbers could be 21/9 or 12/2, depending on the NSG log file types
    # file_head_bytes => 12
    # file_tail_bytes => 2
    # Enable / tweak these settings when an event is too big for the codec to handle.
    # break_json_down_policy => "with_head_tail"
    # break_json_batch_count => 2
  }
}

filter {
  split { field => "[records]" }
  split { field => "[records][properties][flows]" }
  split { field => "[records][properties][flows][flows]" }
  split { field => "[records][properties][flows][flows][flowTuples]" }
  #split { field => "[records][properties][flows][flows][flowTuples][10]" }
  #split { field => "[records][properties][flows][flows][flowTuples][11]" }
  #split { field => "[records][properties][flows][flows][flowTuples][12]" }
  mutate {
    split => { "[records][resourceId]" => "/" }
    add_field => {
      "Subscription" => "%{[records][resourceId][2]}"
      "ResourceGroup" => "%{[records][resourceId][4]}"
      "NetworkSecurityGroup" => "%{[records][resourceId][8]}"
    }
    convert => { "Subscription" => "string" }
    convert => { "ResourceGroup" => "string" }
    convert => { "NetworkSecurityGroup" => "string" }
    split => { "[records][properties][flows][flows][flowTuples]" => "," }
    add_field => {
      "unixtimestamp" => "%{[records][properties][flows][flows][flowTuples][0]}"
      "srcIp" => "%{[records][properties][flows][flows][flowTuples][1]}"
      "destIp" => "%{[records][properties][flows][flows][flowTuples][2]}"
      "srcPort" => "%{[records][properties][flows][flows][flowTuples][3]}"
      "destPort" => "%{[records][properties][flows][flows][flowTuples][4]}"
      "protocol" => "%{[records][properties][flows][flows][flowTuples][5]}"
      "trafficflow" => "%{[records][properties][flows][flows][flowTuples][6]}"
      "traffic" => "%{[records][properties][flows][flows][flowTuples][7]}"
      "flowstate" => "%{[records][properties][flows][flows][flowTuples][8]}"
      "packetsSourceToDest" => "%{[records][properties][flows][flows][flowTuples][9]}"
      "bytesSentSourceToDest" => "%{[records][properties][flows][flows][flowTuples][10]}"
      "packetsDestToSource" => "%{[records][properties][flows][flows][flowTuples][11]}"
      "bytesSentDestToSource" => "%{[records][properties][flows][flows][flowTuples][12]}"
    }
    convert => { "unixtimestamp" => "integer" }
    convert => { "srcPort" => "integer" }
    convert => { "destPort" => "integer" }
  }
  date {
    match => ["unixtimestamp", "UNIX"]
    timezone => "Asia/Shanghai"
    target => "unixtimestamp_utc8"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => "vpc-flowlogs-%{+YYYY.MM.dd}"
    hosts => ["1.1.1.1:111"]
    user => "xxxx"
    password => "xxxxx"
  }
}
```
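As a plain-Ruby illustration of what the filter stanza does (the sample tuple string and helper names below are hypothetical; the field order and resourceId indices are taken directly from the config):

```ruby
# Field names for NSG flow log flowTuples, in the same order as the
# add_field mappings in the filter (tuple positions 0..12).
FLOW_TUPLE_FIELDS = %w[
  unixtimestamp srcIp destIp srcPort destPort protocol
  trafficflow traffic flowstate packetsSourceToDest
  bytesSentSourceToDest packetsDestToSource bytesSentDestToSource
].freeze

# Split a comma-separated flow tuple into a named hash, converting the
# same fields to integers as the mutate/convert settings do.
def parse_flow_tuple(tuple)
  event = FLOW_TUPLE_FIELDS.zip(tuple.split(',')).to_h
  %w[unixtimestamp srcPort destPort].each { |k| event[k] = event[k].to_i }
  event
end

# Extract Subscription / ResourceGroup / NetworkSecurityGroup from a
# resourceId path, mirroring the split on "/" and indices 2, 4 and 8.
def parse_resource_id(resource_id)
  parts = resource_id.split('/')
  { 'Subscription'         => parts[2],
    'ResourceGroup'        => parts[4],
    'NetworkSecurityGroup' => parts[8] }
end

sample = '1542110377,10.0.0.4,13.67.143.118,44931,443,T,O,A,C,30,1685,1587,9165'
event = parse_flow_tuple(sample)
# event['srcPort'] => 44931, event['destIp'] => "13.67.143.118"
```

This is only a sketch of the field mapping; in the real pipeline the splitting happens event by event inside Logstash.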
The error message says there is a problem with the output codec rubydebug, and it looks like it can't handle a multi_receive. There is also a missing shutdown_requested?, which is something inside Logstash itself. I don't think my input codec directly caused this. Logstash 6.8.6 is from December 2019; I would use Logstash 7 or 8. There was a gem conflict that was fixed in Logstash 7, where the Faraday version Logstash uses was finally updated, so that I could update azure-storage-common.
You don't have to use split and mutate in the filter stanza if you set logtype => "nsgflowlog"; then the parsing is already done in the plugin. I did this because nsgflowlogs were my primary target, and since all the JSON is already loaded in memory, splitting inside the plugin made the most sense. Configuration examples can be found in the README.md, although I see that I messed up the formatting there. This version is still readable; look for "A more elaborate input configuration example" in README.md.
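For reference, a minimal input sketch using that option (the connection string and container values are placeholders; logtype => "nsgflowlog" is the setting referred to above):

```
input {
  azure_blob_storage {
    connection_string => "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.chinacloudapi.cn"
    container => "insights-logs-networksecuritygroupflowevent"
    logtype => "nsgflowlog"
  }
}
```

With this, the filter stanza can drop the split and mutate blocks entirely.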
The logstash-input-azure_blob_storage plugin did not change.
But the 0.9.6 version does not support the connection_string field, so I have to use the 0.12.7 version.
Could anybody tell me how to install or update the logstash-input-azure_blob_storage plugin to version 0.12.7?