shashim007 closed this issue 6 years ago.
The default registry policy (`resume`) treats all existing blobs as already parsed, so I suspect Logstash is waiting for new content to be appended to existing blobs or for new blobs to be created.
You can change this behavior with `registry_create_policy => "start_over"`.
You'll need to delete the registry for the change to take effect.
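For reference, a minimal sketch of where that option goes in the input block (the account, key, and container values here are placeholders, not real credentials):

```
input {
  azureblob {
    storage_account_name => "my_account"
    storage_access_key => "my_key"
    container => "my_container"
    # Reprocess all blobs from the beginning instead of resuming
    # from the stored registry (the default policy is "resume").
    registry_create_policy => "start_over"
  }
}
```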
@EmFl Thanks, this worked.
I used this config:
```
input {
  azureblob {
    storage_account_name => "storage_account_name"
    storage_access_key => "storage_key"
    container => "container_name"
  }
}

output {
  file {
    path => '/tmp/output.txt'
    codec => rubydebug
  }
  stdout {
    codec => rubydebug
  }
}
```
It throws this to stdout:

```
../bin/logstash -f azureblob-to-file.conf --debug
2017-12-11 19:26:31,008 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-12-11 19:26:31,009 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-12-11 19:26:31,010 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-12-11 19:26:31,011 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
2017-12-11 19:26:33,494 main ERROR Unable to locate appender "${sys:ls.log.format}_console" for logger config "root"
2017-12-11 19:26:33,495 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling" for logger config "root"
2017-12-11 19:26:33,496 main ERROR Unable to locate appender "${sys:ls.log.format}_rolling_slowlog" for logger config "slowlog"
2017-12-11 19:26:33,496 main ERROR Unable to locate appender "${sys:ls.log.format}_console_slowlog" for logger config "slowlog"
Sending Logstash's logs to /home/jciazdeploy/logstash-6.0.0/logs which is now configured via log4j2.properties
[2017-12-11T19:26:33,666][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/jciazdeploy/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-11T19:26:33,672][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x62ba655a @module_name="fb_apache", @directory="/home/jciazdeploy/logstash-6.0.0/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-11T19:26:33,674][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/jciazdeploy/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-11T19:26:33,674][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x659bbbae @module_name="netflow", @directory="/home/jciazdeploy/logstash-6.0.0/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-11T19:26:33,975][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/home/jciazdeploy/logstash-6.0.0/config/pipelines.yml"}
[2017-12-11T19:26:33,977][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-11T19:26:34,007][DEBUG][logstash.agent           ] Agent: Configuring metric collection
[2017-12-11T19:26:34,015][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-11T19:26:34,063][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-11T19:26:34,128][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-11T19:26:34,131][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-11T19:26:34,148][DEBUG][logstash.agent           ] starting agent
[2017-12-11T19:26:34,181][DEBUG][logstash.agent           ] Starting puma
[2017-12-11T19:26:34,184][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["/home/jciazdeploy/logstash-6.0.0/config//home/jciazdeploy/logstash-6.0.0/logs", "/home/jciazdeploy/logstash-6.0.0/config/azure-logzio", "/home/jciazdeploy/logstash-6.0.0/config/azureblob-to-logzio.conf", "/home/jciazdeploy/logstash-6.0.0/config/file-to-logz.io", "/home/jciazdeploy/logstash-6.0.0/config/file-to-stdout.conf", "/home/jciazdeploy/logstash-6.0.0/config/java_pid10980.hprof", "/home/jciazdeploy/logstash-6.0.0/config/java_pid2038.hprof", "/home/jciazdeploy/logstash-6.0.0/config/java_pid2919.hprof", "/home/jciazdeploy/logstash-6.0.0/config/jvm.options", "/home/jciazdeploy/logstash-6.0.0/config/log4j2.properties", "/home/jciazdeploy/logstash-6.0.0/config/logstash-simple.conf", "/home/jciazdeploy/logstash-6.0.0/config/logstash.yml", "/home/jciazdeploy/logstash-6.0.0/config/pipelines.yml", "/home/jciazdeploy/logstash-6.0.0/config/ssl_azureblob", "/home/jciazdeploy/logstash-6.0.0/config/startup.options", "/home/jciazdeploy/logstash-6.0.0/config/stdin-to-file"]}
[2017-12-11T19:26:34,186][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2017-12-11T19:26:34,188][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/home/jciazdeploy/logstash-6.0.0/config/azureblob-to-file.conf"}
[2017-12-11T19:26:34,221][DEBUG][logstash.api.service     ] [api-service] start
[2017-12-11T19:26:34,234][DEBUG][logstash.agent           ] Converging pipelines
[2017-12-11T19:26:34,241][DEBUG][logstash.agent           ] Needed actions to converge {:actions_count=>1}
[2017-12-11T19:26:34,244][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2017-12-11T19:26:34,324][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-11T19:26:36,143][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"azureblob", :type=>"input", :class=>LogStash::Inputs::LogstashInputAzureblob}
[2017-12-11T19:26:36,176][INFO ][logstash.inputs.logstashinputazureblob] Using version 0.9.x input plugin 'azureblob'. This plugin should work but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin.
[2017-12-11T19:26:36,196][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"json_lines", :type=>"codec", :class=>LogStash::Codecs::JSONLines}
[2017-12-11T19:26:36,205][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@id = "json_lines_84d648a0-1c02-4032-8fc2-2f2222f7638c"
[2017-12-11T19:26:36,207][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@enable_metric = true
[2017-12-11T19:26:36,207][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@charset = "UTF-8"
[2017-12-11T19:26:36,208][DEBUG][logstash.codecs.jsonlines] config LogStash::Codecs::JSONLines/@delimiter = "\n"
[2017-12-11T19:26:36,219][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@storage_account_name = "datauimssa"
[2017-12-11T19:26:36,220][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@storage_access_key = "wqVLoG2b+MIqUHhOwdyx6trYN1lmqouiFcryj9iB61ogpMCO+DWMYx3X5ywm6Yq/jVTvnGNf6XPzhCr72KKAhQ=="
[2017-12-11T19:26:36,220][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@container = "serverlogs"
[2017-12-11T19:26:36,222][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@id = "784a1c43a656f5f42c6d21a4afb2d4b655af8e8609b05af514e1c5f8d45f0f5f"
[2017-12-11T19:26:36,224][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@enable_metric = true
[2017-12-11T19:26:36,226][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@codec = <LogStash::Codecs::JSONLines id=>"json_lines_84d648a0-1c02-4032-8fc2-2f2222f7638c", enable_metric=>true, charset=>"UTF-8", delimiter=>"\n">
[2017-12-11T19:26:36,226][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@add_field = {}
[2017-12-11T19:26:36,227][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@endpoint = "core.windows.net"
[2017-12-11T19:26:36,228][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@registry_path = "data/registry"
[2017-12-11T19:26:36,228][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@registry_lease_duration = 15
[2017-12-11T19:26:36,228][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@interval = 30
[2017-12-11T19:26:36,229][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@registry_create_policy = "resume"
[2017-12-11T19:26:36,229][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@file_head_bytes = 0
[2017-12-11T19:26:36,229][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@file_tail_bytes = 0
[2017-12-11T19:26:36,230][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@blob_list_page_size = 100
[2017-12-11T19:26:36,230][DEBUG][logstash.inputs.logstashinputazureblob] config LogStash::Inputs::LogstashInputAzureblob/@file_chunk_size_bytes = 4194304
[2017-12-11T19:26:36,252][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"file", :type=>"output", :class=>LogStash::Outputs::File}
[2017-12-11T19:26:36,281][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2017-12-11T19:26:36,286][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_5218548d-28e7-44b9-bfb4-423b683281cc"
[2017-12-11T19:26:36,286][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2017-12-11T19:26:36,287][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2017-12-11T19:26:36,502][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@path = "/tmp/output-test.txt"
[2017-12-11T19:26:36,503][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_5218548d-28e7-44b9-bfb4-423b683281cc", enable_metric=>true, metadata=>false>
[2017-12-11T19:26:36,503][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@id = "59b92181a97d956c8a8f5b360dcfc8920cc33e68022b35d45f8e3e1bc7951d08"
[2017-12-11T19:26:36,503][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@enable_metric = true
[2017-12-11T19:26:36,504][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@workers = 1
[2017-12-11T19:26:36,504][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@flush_interval = 2
[2017-12-11T19:26:36,504][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@gzip = false
[2017-12-11T19:26:36,505][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"
[2017-12-11T19:26:36,505][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@create_if_deleted = true
[2017-12-11T19:26:36,506][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@dir_mode = -1
[2017-12-11T19:26:36,507][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@file_mode = -1
[2017-12-11T19:26:36,517][DEBUG][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main"}
[2017-12-11T19:26:36,526][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250, :thread=>"#<Thread:0x22eaaf58@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 run>"}
[2017-12-11T19:26:39,585][INFO ][logstash.pipeline        ] Pipeline started {"pipeline.id"=>"main"}
[2017-12-11T19:26:39,592][DEBUG][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x22eaaf58@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:26:39,637][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
^C[2017-12-11T19:26:41,902][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Stopping
[2017-12-11T19:26:41,904][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Stopping
[2017-12-11T19:26:41,909][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Stopping
[2017-12-11T19:26:41,910][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Stopping
[2017-12-11T19:26:41,944][DEBUG][logstash.agent           ] Shutting down all pipelines {:pipelines_count=>1}
[2017-12-11T19:26:41,945][DEBUG][logstash.agent           ] Converging pipelines
[2017-12-11T19:26:41,945][DEBUG][logstash.agent           ] Needed actions to converge {:actions_count=>1}
[2017-12-11T19:26:41,945][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Stop/pipeline_id:main}
[2017-12-11T19:26:41,969][DEBUG][logstash.pipeline        ] Closing inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x22eaaf58@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:26:41,970][DEBUG][logstash.inputs.logstashinputazureblob] stopping {:plugin=>"LogStash::Inputs::LogstashInputAzureblob"}
[2017-12-11T19:26:43,284][DEBUG][logstash.pipeline        ] Closed inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x22eaaf58@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:26:43,285][DEBUG][logstash.pipeline        ] Closing inputs {:pipeline_id=>"main", :thread=>"#<Thread:0x22eaaf58@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
```
```
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 21710
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/45d9c4.log content length: 22420
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 22420
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/5592d4.log content length: 13520
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 13520
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/673e5d.log content length: 19785
[2017-12-11T19:24:23,979][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 19785
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/697e0e.log content length: 18294
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 18294
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/775bed.log content length: 22227
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 22227
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/b297fa.log content length: 21284
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 21284
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/c259d3.log content length: 25101
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 25101
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: DATAU-IMS-WS/2017/11/22/14/ff2518.log content length: 17539
[2017-12-11T19:24:23,980][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 17539
[2017-12-11T19:24:23,981][DEBUG][logstash.inputs.logstashinputazureblob] candidate_blob: ff2518.log content length: 17539
[2017-12-11T19:24:23,981][DEBUG][logstash.inputs.logstashinputazureblob] registry_item offset: 17539
[2017-12-11T19:24:24,403][DEBUG][logstash.inputs.logstashinputazureblob] Hitting interval of 30ms . . .
[2017-12-11T19:24:28,124][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:33,125][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:38,125][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:43,126][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:48,127][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:53,127][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:24:58,229][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
[2017-12-11T19:25:03,229][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x23e44671@/home/jciazdeploy/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:290 sleep>"}
```
But I don't see anything in the file.