papiveron opened this issue 7 years ago
Thank you. Your example config helped me configure the prefix path. The doc is lacking in this regard.
I have the same problem.
I think you need to configure the Access Key and Secret Key.
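A minimal sketch of what that might look like in the s3 input (the bucket name and key values are placeholders; `access_key_id` and `secret_access_key` are the plugin's credential options, though an IAM role or environment variables also work):

```
input {
  s3 {
    bucket            => "my-elb-logs"      # placeholder bucket name
    region            => "us-east-1"
    access_key_id     => "AKIA..."          # placeholder credentials
    secret_access_key => "YOUR_SECRET_KEY"
  }
}
```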
Same Problem
Hi, I am using the s3 plugin to push ELB logs to Kibana. This is my input and output config:

```
input {
  s3 {
    bucket => "bucketname/logs"
    type   => "elb"
    prefix => "/AWSLogs/2710822xxc691/elasticloadbalancing/us-east-1/2019/07/29/"
    region => "us-east-1"
    codec  => plain
  }
}

output {
  elasticsearch {
    hosts => ["http://10.30.6.119:9200", "http://10.20.6.177:9200"]
    index => "%{[@metadata][indexname]}-%{+YYYY.MM.dd}"
  }
}
```

I am using ELK version 6.4.
Logstash fails with this error:

```
"undefined method `[]=' for nil:NilClass", :backtrace=>["/opt/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:43:in `stop!'"]
```

Please suggest any mistake that can be corrected, as I am not able to get any output.
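For what it's worth, two things in the config above look off, based on the logstash-input-s3 documentation: `bucket` takes only the bucket name (no path component), and `prefix` is an object-key prefix, which for ELB logs does not start with a slash. A corrected sketch (the account ID is kept as posted; everything else is from the original config):

```
input {
  s3 {
    bucket => "bucketname"   # bucket name only, no "/logs" path component
    prefix => "AWSLogs/2710822xxc691/elasticloadbalancing/us-east-1/2019/07/29/"   # no leading slash
    type   => "elb"
    region => "us-east-1"
    codec  => plain
  }
}
```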
Hi, I have an issue with my logstash s3 input. The last messages I see in my Kibana interface are from several days ago. I have an AWS ELB with logging enabled. I've tested from the command line and I can see that logstash is continuously processing inputs, but never outputs. In the ELB s3 bucket there is one folder per day, per month, per year, and each folder contains several log files, for a total size of around 60GB.

It was working fine at the beginning, but as the logs grew it became slow, and now I no longer see my logs on the output side. Logstash keeps performing the input and filter stages, but never outputs logs.
I created a dedicated configuration file for testing, with only s3 as input, and tested it on a dedicated machine from the command line.
The s3.conf file:
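The snippet itself did not survive in the thread; a minimal sketch of such a test pipeline, assuming a placeholder bucket name and the stdout output for inspection, would be:

```
input {
  s3 {
    bucket => "my-elb-bucket"   # placeholder: the real ELB log bucket
    region => "eu-west-1"       # placeholder region
    type   => "elb"
  }
}

output {
  # print each event to the console so output activity is visible
  stdout { codec => rubydebug }
}
```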
And I can see input processing, filtering, and messages like "will start output worker.....", but no output event is ever received.
I created a new folder (named test_elb) in the bucket, copied logs from one day's folder (31/12/2016 for example) into it, and then set the newly created folder as the prefix in my input configuration, like this:
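(That snippet is also missing from the thread; presumably something along these lines, with the bucket name as a placeholder:)

```
input {
  s3 {
    bucket => "my-elb-bucket"   # placeholder
    prefix => "test_elb/"       # only read objects under the copied test folder
    type   => "elb"
  }
}
```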
And with that s3 prefix, logstash does all the pipeline processing (input, filter, output) as expected, and I see my log output. So it seems to me that the bucket is too large and the logstash-s3 plugin has difficulty processing it. Can someone here advise on this problem, please?
My logstash version: 2.2.4. Operating system: Debian Jessie.
I've searched and asked on the discuss.elastic forum and in the elasticsearch IRC channel, with no real solution. Do you think it could be a bucket size issue?
Thanks for the help.
Regards.
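A plausible explanation, consistent with the symptoms above: the s3 input lists every object under `bucket`/`prefix` on each poll, so on a bucket of this size the listing phase can dominate and events stop reaching the output. A commonly suggested mitigation (a sketch only; the bucket names are placeholders, not from this thread) is to keep the prefix narrow and move or delete objects once they are processed:

```
input {
  s3 {
    bucket           => "my-elb-bucket"       # placeholder
    prefix           => "AWSLogs/"            # keep the listing as narrow as possible
    backup_to_bucket => "my-elb-bucket-done"  # hypothetical archive bucket for processed files
    delete           => true                  # remove processed objects so each poll lists fewer keys
  }
}
```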