spujadas / elk-docker

Elasticsearch, Logstash, Kibana (ELK) Docker image

sebp/elk failing to start up properly #325

Closed · dannystaple closed this issue 4 years ago

dannystaple commented 4 years ago

Running on a Mac host. I have tried the :latest and :760 tags. I can see Elasticsearch start, and to begin with I am able to reach it on port 9200.

However, Kibana isn't up.

I wait a few minutes and then try to hit Kibana, only to get the "Kibana isn't ready" message. The logs now say "No living connections", and I am no longer able to reach Elasticsearch on port 9200.
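
For reference, this is roughly how I am starting the container and checking Elasticsearch (reconstructed from memory, so the exact flags may differ slightly):

docker run -d --name elk \
  -p 5601:5601 -p 9200:9200 -p 5044:5044 \
  sebp/elk:760

curl http://localhost:9200/   # responds at first, then stops responding later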

spujadas commented 4 years ago

Usual disclaimer: I don't have access to a Mac, so if it turns out it's Mac-specific, I won't be able to help.

Could you share your start-up logs? Do you have enough RAM assigned to the container? (Not having enough RAM can cause Elasticsearch to die silently.)
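
If RAM does turn out to be the issue, one workaround (untested on a Mac, and assuming the ES_HEAP_SIZE and LS_HEAP_SIZE environment variables documented for this image) is to cap the JVM heaps when starting the container, for example:

docker run -d --name elk \
  -p 5601:5601 -p 9200:9200 -p 5044:5044 \
  -e ES_HEAP_SIZE="1g" \
  -e LS_HEAP_SIZE="512m" \
  sebp/elk:760

Running docker stats --no-stream elk can also help show how close the container is getting to its memory limit.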

dannystaple commented 4 years ago

Here are the logs:

* Starting periodic command scheduler cron        
[ OK ]
 * Starting Elasticsearch Server        future versions of Elasticsearch will require Java 11; your Java version from [/usr/lib/jvm/java-8-openjdk-amd64/jre] does not meet this requirement

[ OK ]
waiting for Elasticsearch to be up (1/30)
waiting for Elasticsearch to be up (2/30)
waiting for Elasticsearch to be up (3/30)
waiting for Elasticsearch to be up (4/30)
waiting for Elasticsearch to be up (5/30)
waiting for Elasticsearch to be up (6/30)
waiting for Elasticsearch to be up (7/30)
waiting for Elasticsearch to be up (8/30)
waiting for Elasticsearch to be up (9/30)
Waiting for Elasticsearch cluster to respond (1/30)
logstash started.
 * Starting Kibana5        
[ OK ]
==> /var/log/elasticsearch/elasticsearch.log <==
[2020-05-14T19:20:41,657][INFO ][o.e.c.m.MetaDataIndexTemplateService] [elk] adding template [.monitoring-logstash] for index patterns [.monitoring-logstash-7-*]
[2020-05-14T19:20:41,702][INFO ][o.e.c.m.MetaDataIndexTemplateService] [elk] adding template [.monitoring-es] for index patterns [.monitoring-es-7-*]
[2020-05-14T19:20:41,746][INFO ][o.e.c.m.MetaDataIndexTemplateService] [elk] adding template [.monitoring-beats] for index patterns [.monitoring-beats-7-*]
[2020-05-14T19:20:41,782][INFO ][o.e.c.m.MetaDataIndexTemplateService] [elk] adding template [.monitoring-alerts-7] for index patterns [.monitoring-alerts-7]
[2020-05-14T19:20:41,822][INFO ][o.e.c.m.MetaDataIndexTemplateService] [elk] adding template [.monitoring-kibana] for index patterns [.monitoring-kibana-7-*]
[2020-05-14T19:20:41,855][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [elk] adding index lifecycle policy [watch-history-ilm-policy]
[2020-05-14T19:20:41,893][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [elk] adding index lifecycle policy [ilm-history-ilm-policy]
[2020-05-14T19:20:41,925][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [elk] adding index lifecycle policy [slm-history-ilm-policy]
[2020-05-14T19:20:42,082][INFO ][o.e.l.LicenseService     ] [elk] license [12b5f9e4-4cf3-4c5a-bbba-5454f7717184] mode [basic] - valid
[2020-05-14T19:20:42,083][INFO ][o.e.x.s.s.SecurityStatusChangeListener] [elk] Active license is now [BASIC]; Security is disabled

==> /var/log/logstash/logstash-plain.log <==

==> /var/log/kibana/kibana5.log <==
{"type":"log","@timestamp":"2020-05-14T19:20:53Z","tags":["info","plugins-service"],"pid":287,"message":"Plugin \"case\" is disabled."}

==> /var/log/logstash/logstash-plain.log <==
[2020-05-14T19:21:05,324][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/opt/logstash/data/queue"}
[2020-05-14T19:21:05,498][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/logstash/data/dead_letter_queue"}
[2020-05-14T19:21:06,227][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.6.0"}
[2020-05-14T19:21:06,269][INFO ][logstash.agent           ] No persistent UUID file found. Generating new UUID {:uuid=>"43e439d8-638e-4f80-a622-5187a89f241e", :path=>"/opt/logstash/data/uuid"}
[2020-05-14T19:21:10,095][INFO ][org.reflections.Reflections] Reflections took 70 ms to scan 1 urls, producing 20 keys and 40 values 
[2020-05-14T19:21:11,808][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2020-05-14T19:21:12,362][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2020-05-14T19:21:12,472][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
[2020-05-14T19:21:12,483][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2020-05-14T19:21:12,971][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2020-05-14T19:21:13,500][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][main] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been create for key: cluster_uuids. This may result in invalid serialization.  It is recommended to log an issue to the responsible developer/development team.
[2020-05-14T19:21:13,514][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>6, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>750, "pipeline.sources"=>["/etc/logstash/conf.d/02-beats-input.conf", "/etc/logstash/conf.d/10-syslog.conf", "/etc/logstash/conf.d/11-nginx.conf", "/etc/logstash/conf.d/30-output.conf"], :thread=>"#<Thread:0x29830951 run>"}
[2020-05-14T19:21:18,044][INFO ][logstash.inputs.beats    ][main] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2020-05-14T19:21:18,158][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2020-05-14T19:21:18,544][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2020-05-14T19:21:18,578][INFO ][org.logstash.beats.Server][main] Starting server on port: 5044
[2020-05-14T19:21:22,438][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

==> /var/log/elasticsearch/elasticsearch.log <==
[2020-05-14T19:22:21,553][INFO ][o.e.x.m.p.NativeController] [elk] Native controller process has stopped - no new native processes can be started

==> /var/log/kibana/kibana5.log <==
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins-system"],"pid":287,"message":"Setting up [37] plugins: [timelion,features,usageCollection,metrics,canvas,apm_oss,taskManager,siem,licensing,security,infra,encryptedSavedObjects,code,uiActions,data,navigation,status_page,share,newsfeed,kibana_legacy,management,dev_tools,eui_utils,inspector,expressions,visualizations,embeddable,dashboard_embeddable_container,advancedUiActions,home,spaces,apm,cloud,graph,bfetch,translations,reporting]"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","timelion"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","features"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","usageCollection"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","metrics"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","canvas"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","apm_oss"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","taskManager"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","siem"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","licensing"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","security"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["warning","plugins","security","config"],"pid":287,"message":"Generating a random key for xpack.security.encryptionKey. To prevent sessions from being invalidated on restart, please set xpack.security.encryptionKey in kibana.yml"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["warning","plugins","security","config"],"pid":287,"message":"Session cookies will be transmitted over insecure connections. This is not recommended."}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","infra"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","encryptedSavedObjects"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["warning","plugins","encryptedSavedObjects","config"],"pid":287,"message":"Generating a random key for xpack.encryptedSavedObjects.encryptionKey. To be able to decrypt encrypted saved objects attributes after restart, please set xpack.encryptedSavedObjects.encryptionKey in kibana.yml"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","code"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","data"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","share"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","home"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","spaces"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","apm"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["error","elasticsearch","data"],"pid":287,"message":"Request error, retrying\nHEAD http://localhost:9200/.apm-agent-configuration => connect ECONNREFUSED 127.0.0.1:9200"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["error","elasticsearch","admin"],"pid":287,"message":"Request error, retrying\nGET http://localhost:9200/_nodes?filter_path=nodes.*.version%2Cnodes.*.http.publish_address%2Cnodes.*.ip => connect ECONNREFUSED 127.0.0.1:9200"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["error","elasticsearch","data"],"pid":287,"message":"Request error, retrying\nGET http://localhost:9200/_xpack => connect ECONNREFUSED 127.0.0.1:9200"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["warning","elasticsearch","data"],"pid":287,"message":"Unable to revive connection: http://localhost:9200/"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["warning","elasticsearch","data"],"pid":287,"message":"No living connections"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","cloud"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","graph"],"pid":287,"message":"Setting up plugin"}
{"type":"log","@timestamp":"2020-05-14T19:22:48Z","tags":["info","plugins","bfetch"],"pid":287,"message":"Setting up plugin"}

However, I'm going to adjust the memory: it was at the default of 2 GB, and I realise I may have missed the prerequisites section of the guide. Let me retry with 4 GB.
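
For anyone else on Docker Desktop for Mac: the limit is set under Preferences > Resources (the exact menu path may vary by version). A rough way to confirm what the Docker VM and the container actually have:

docker info | grep -i "total memory"   # RAM available to the Docker VM
docker stats --no-stream elk           # container memory usage vs. its limit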

dannystaple commented 4 years ago

OK, closing - memory was the issue.