deviantony / docker-elk

The Elastic stack (ELK) powered by Docker and Compose.
MIT License

Cannot log in to Kibana: at least one primary shard for the index [.security-7] is unavailable #829

Closed: tomofu74 closed this issue 1 year ago

tomofu74 commented 1 year ago

Problem description

I can't log in to Kibana. When I access http://localhost:5601 several minutes after running docker-compose up, it shows "Kibana server is not ready yet."

Extra information

I just cloned the main branch and changed a single setting in elasticsearch/config/elasticsearch.yml:

xpack.license.self_generated.type: basic
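If it helps, I also double-checked the active license through the license API once the stack was up (this assumes the default elastic user; replace changeme with the password from your .env):

$ curl http://localhost:9200/_license -u elastic:changeme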

Stack configuration

Docker setup

Ubuntu 22.04.1

$ docker version
Docker version 20.10.5, build 55c4c88

$ docker-compose version
Docker Compose version v2.16.0

Container logs

$ docker-compose logs

docker-elk-elasticsearch-1  | {"@timestamp":"2023-02-19T15:29:32.608Z", "log.level": "INFO", "message":"Authentication of [kibana_system] was terminated by realm [reserved] - failed to authenticate user [kibana_system]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[elasticsearch][transport_worker][T#3]","log.logger":"org.elasticsearch.xpack.security.authc.RealmsAuthenticator","trace.id":"2126dceede8dadd53a59081dfb13575a","elasticsearch.cluster.uuid":"uc9hsY7jTtyORGuvLSvU9A","elasticsearch.node.id":"66hGAvE-SqSjDZvOCd7lkw","elasticsearch.node.name":"elasticsearch","elasticsearch.cluster.name":"docker-cluster"}
docker-elk-elasticsearch-1  | {"@timestamp":"2023-02-19T15:29:33.217Z", "log.level":"ERROR", "message":"failed to retrieve password hash for reserved user [kibana_system]", "ecs.version": "1.2.0","service.name":"ES_ECS","event.dataset":"elasticsearch.server","process.thread.name":"elasticsearch[elasticsearch][transport_worker][T#3]","log.logger":"org.elasticsearch.xpack.security.authc.esnative.ReservedRealm","trace.id":"2126dceede8dadd53a59081dfb13575a","elasticsearch.cluster.uuid":"uc9hsY7jTtyORGuvLSvU9A","elasticsearch.node.id":"66hGAvE-SqSjDZvOCd7lkw","elasticsearch.node.name":"elasticsearch","elasticsearch.cluster.name":"docker-cluster","error.type":"org.elasticsearch.action.UnavailableShardsException","error.message":"at least one primary shard for the index [.security-7] is unavailable","error.stack_trace":"org.elasticsearch.action.UnavailableShardsException: at least one primary shard for the index [.security-7] is unavailable\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.support.SecurityIndexManager.getUnavailableReason(SecurityIndexManager.java:138)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.esnative.NativeUsersStore.getReservedUserInfo(NativeUsersStore.java:603)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.esnative.ReservedRealm.getUserInfo(ReservedRealm.java:275)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.esnative.ReservedRealm.doAuthenticate(ReservedRealm.java:135)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.support.CachingUsernamePasswordRealm.authenticateWithCache(CachingUsernamePasswordRealm.java:200)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.support.CachingUsernamePasswordRealm.authenticate(CachingUsernamePasswordRealm.java:105)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.RealmsAuthenticator.lambda$consumeToken$4(RealmsAuthenticator.java:147)\n\tat org.elasticsearch.xcore@8.6.2/org.elasticsearch.xpack.core.common.IteratingActionListener.run(IteratingActionListener.java:117)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.RealmsAuthenticator.consumeToken(RealmsAuthenticator.java:234)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.RealmsAuthenticator.authenticate(RealmsAuthenticator.java:83)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.lambda$getAuthenticatorConsumer$5(AuthenticatorChain.java:180)\n\tat org.elasticsearch.xcore@8.6.2/org.elasticsearch.xpack.core.common.IteratingActionListener.onResponse(IteratingActionListener.java:135)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.lambda$getAuthenticatorConsumer$5(AuthenticatorChain.java:158)\n\tat org.elasticsearch.xcore@8.6.2/org.elasticsearch.xpack.core.common.IteratingActionListener.onResponse(IteratingActionListener.java:135)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.lambda$getAuthenticatorConsumer$5(AuthenticatorChain.java:158)\n\tat org.elasticsearch.xcore@8.6.2/org.elasticsearch.xpack.core.common.IteratingActionListener.onResponse(IteratingActionListener.java:135)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.lambda$getAuthenticatorConsumer$5(AuthenticatorChain.java:158)\n\tat 
org.elasticsearch.xcore@8.6.2/org.elasticsearch.xpack.core.common.IteratingActionListener.run(IteratingActionListener.java:117)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.doAuthenticate(AuthenticatorChain.java:136)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticatorChain.authenticateAsync(AuthenticatorChain.java:95)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticationService.authenticate(AuthenticationService.java:149)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.authc.AuthenticationService.authenticate(AuthenticationService.java:127)\n\tat org.elasticsearch.security@8.6.2/org.elasticsearch.xpack.security.rest.SecurityRestFilter.handleRequest(SecurityRestFilter.java:101)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:398)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.rest.RestController.tryAllHandlers(RestController.java:532)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:313)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.http.AbstractHttpServerTransport.dispatchRequest(AbstractHttpServerTransport.java:379)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.http.AbstractHttpServerTransport.handleIncomingRequest(AbstractHttpServerTransport.java:460)\n\tat org.elasticsearch.server@8.6.2/org.elasticsearch.http.AbstractHttpServerTransport.incomingRequest(AbstractHttpServerTransport.java:353)\n\tat org.elasticsearch.transport.netty4@8.6.2/org.elasticsearch.http.netty4.Netty4HttpPipeliningHandler.handlePipelinedRequest(Netty4HttpPipeliningHandler.java:128)\n\tat org.elasticsearch.transport.netty4@8.6.2/org.elasticsearch.http.netty4.Netty4HttpPipeliningHandler.channelRead(Netty4HttpPipeliningHandler.java:118)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.MessageToMessageCodec.channelRead(MessageToMessageCodec.java:111)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat 
io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:336)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:308)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.codec@4.1.84.Final/io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.nio.NioEventLoop.processSelectedKeysPlain(NioEventLoop.java:689)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:652)\n\tat io.netty.transport@4.1.84.Final/io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)\n\tat io.netty.common@4.1.84.Final/io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)\n\tat io.netty.common@4.1.84.Final/io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)\n\tat java.base/java.lang.Thread.run(Thread.java:1589)\n"}
antoineco commented 1 year ago

Please attach the logs as requested. I can't even see whether the setup completed successfully.

antoineco commented 1 year ago

Probably a duplicate of https://github.com/deviantony/docker-elk/issues/795#issuecomment-1331918093

tomofu74 commented 1 year ago

Hi, antoineco. Thank you for your comment.

log: log.txt

By the way, the disk has 32 GB of free space. That should be enough, shouldn't it?

antoineco commented 1 year ago

Thanks for sharing!

32 GB should be enough for quite a lot of data, yes, but Elasticsearch doesn't calculate the flood-stage disk watermark from absolute numbers; it is expressed as a percentage of the total disk size. This is visible in the following log entry (prettified with jq for readability):

{
  "@timestamp": "2023-02-19T16:04:12.722Z",
  "log.level": "WARN",
  "message": "flood stage disk watermark [95%] exceeded on [66hGAvE-SqSjDZvOCd7lkw][elasticsearch][/usr/share/elasticsearch/data] free: 33.5gb[3.8%], all indices on this node will be marked read-only",
  "ecs.version": "1.2.0",
  "service.name": "ES_ECS",
  "event.dataset": "elasticsearch.server",
  "process.thread.name": "elasticsearch[elasticsearch][management][T#2]",
  "log.logger": "org.elasticsearch.cluster.routing.allocation.DiskThresholdMonitor",
  "elasticsearch.node.name": "elasticsearch",
  "elasticsearch.cluster.name": "docker-cluster"
}

33.5 GB of free space represents only 3.8% of your entire disk, which is below the 5% of free space required by the flood-stage watermark (in other words, 95% of disk usage). Doing the math: 33.5 GB / 0.038 ≈ 880 GB of total disk, so Elasticsearch wants roughly 44 GB free before it stops enforcing the flood stage.

The solution, as suggested in the issue linked above, is to adjust these watermark settings based on your disk size. There is another watermark called "high", at 90% of disk usage by default, which you'll also need to adjust.
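For example, the watermarks can be raised at runtime through the cluster settings API. This is only a sketch assuming the stack's default elastic superuser and the changeme password from the sample .env file; adjust the percentages and credentials to your setup:

# Raise the high and flood-stage watermarks (defaults: 90% and 95%).
# "persistent" settings survive cluster restarts.
$ curl -XPUT 'http://localhost:9200/_cluster/settings' \
    -u elastic:changeme \
    -H 'Content-Type: application/json' \
    -d '{
      "persistent": {
        "cluster.routing.allocation.disk.watermark.high": "95%",
        "cluster.routing.allocation.disk.watermark.flood_stage": "98%"
      }
    }'

The same keys can also be set statically in elasticsearch.yml if you prefer to keep them under version control.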

tomofu74 commented 1 year ago

Thank you, antoineco! I finally removed some files to free up disk space, and now I can access the Kibana login screen.
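Note for future readers: when the flood-stage watermark is exceeded, Elasticsearch marks all indices read-only, as shown in the log above. On Elasticsearch 7.4 and later (including the 8.6.2 used here) that block is released automatically once disk usage falls back below the high watermark, which is why freeing up disk space was enough. On older versions the block has to be cleared manually, for example (same credential assumptions as above):

# Remove the read-only block from all indices (only needed on Elasticsearch < 7.4)
$ curl -XPUT 'http://localhost:9200/_all/_settings' \
    -u elastic:changeme \
    -H 'Content-Type: application/json' \
    -d '{"index.blocks.read_only_allow_delete": null}'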