k3s-io / k3s

Lightweight Kubernetes
https://k3s.io
Apache License 2.0

kubectl get pods -> Error from server (ServiceUnavailable): the server is currently unable to handle the request #3723

Closed: pandarun closed this issue 3 years ago

pandarun commented 3 years ago

Environmental Info:

K3s Version:

k3s version v1.20.7+k3s1 (aa768cbd) go version go1.15.12

Node(s) CPU architecture, OS, and Version:

cat /proc/cpuinfo

processor   : 0
vendor_id   : AuthenticAMD
cpu family  : 23
model       : 49
model name  : AMD EPYC Processor
stepping    : 0
microcode   : 0x1000065
cpu MHz     : 2495.312
cache size  : 512 KB
physical id : 0
siblings    : 3
core id     : 0
cpu cores   : 3
apicid      : 0
initial apicid  : 0
fpu     : yes
fpu_exception   : yes
cpuid level : 13
wp      : yes
flags       : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx mmxext fxsr_opt pdpe1gb rdtscp lm rep_good nopl cpuid extd_apicid tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 x2apic movbe popcnt aes xsave avx f16c rdrand hypervisor lahf_lm cmp_legacy cr8_legacy abm sse4a misalignsse 3dnowprefetch osvw topoext perfctr_core ssbd ibrs ibpb stibp vmmcall fsgsbase bmi1 avx2 smep bmi2 rdseed adx smap clflushopt clwb sha_ni xsaveopt xsavec xgetbv1 xsaves clzero xsaveerptr wbnoinvd arat umip rdpid arch_capabilities
bugs        : sysret_ss_attrs spectre_v1 spectre_v2 spec_store_bypass
bogomips    : 4990.62
TLB size    : 1024 4K pages
clflush size    : 64
cache_alignment : 64
address sizes   : 40 bits physical, 48 bits virtual
power management:

4 GB RAM, Ubuntu 20.04

Linux k3s-management-1 5.4.0-72-generic #80-Ubuntu SMP Mon Apr 12 17:35:00 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux

Cluster Configuration:

2 servers, 2 agents

Describe the bug:

After removing 1 server and 1 worker node, the cluster became unresponsive.

root@k3s-management-1:~# kubectl get pods
Error from server (ServiceUnavailable): the server is currently unable to handle the request
root@k3s-management-1:~# kubectl get nodes
Error from server (ServiceUnavailable): the server is currently unable to handle the request

Steps To Reproduce:

Expected behavior:

Cluster downscaled to 1 server and 1 worker; all pods that exceed the available resource limits shouldn't be scheduled (restart backoff).

Actual behavior:

no way to get pod/node info

Additional context / logs:


Jul 28 20:16:10 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:10.662+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:10 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:10.662+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:11 k3s-management-1 k3s[540]: time="2021-07-28T20:16:11.516973602+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:11 http: TLS handshake error from 172.16.0.10:58140: remote erro>
Jul 28 20:16:12 k3s-management-1 k3s[540]: time="2021-07-28T20:16:12.057703115+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:12 http: TLS handshake error from 172.16.0.51:58556: remote erro>
Jul 28 20:16:14 k3s-management-1 k3s[540]: time="2021-07-28T20:16:14.067906938+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:14 http: TLS handshake error from 172.16.0.51:58564: remote erro>
Jul 28 20:16:14 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:14.315+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:/>
Jul 28 20:16:14 k3s-management-1 k3s[540]: time="2021-07-28T20:16:14.315744132+02:00" level=error msg="Failed to check local etcd status for learner management: context deadline exceeded"
Jul 28 20:16:14 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:14.315+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0>
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:15.606+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 2810"}
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:15.606+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 2811"}
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:15.606+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 2811"}
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:15.607+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5>
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:15.653+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:/>
Jul 28 20:16:15 k3s-management-1 k3s[540]: time="2021-07-28T20:16:15.653149011+02:00" level=info msg="Failed to test data store connection: context deadline exceeded"
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:15.653+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0>
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:15.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:15 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:15.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:16 k3s-management-1 k3s[540]: time="2021-07-28T20:16:16.076136043+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:16 http: TLS handshake error from 172.16.0.51:58572: remote erro>
Jul 28 20:16:17 k3s-management-1 k3s[540]: time="2021-07-28T20:16:17.014938283+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:17 http: TLS handshake error from 172.16.0.10:58144: remote erro>
Jul 28 20:16:18 k3s-management-1 k3s[540]: time="2021-07-28T20:16:18.084583433+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:18 http: TLS handshake error from 172.16.0.51:58580: remote erro>
Jul 28 20:16:20 k3s-management-1 k3s[540]: time="2021-07-28T20:16:20.093504815+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:20 http: TLS handshake error from 172.16.0.51:58588: remote erro>
Jul 28 20:16:20 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:20.638+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-m>
Jul 28 20:16:20 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:20.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:20 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:20.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:22 k3s-management-1 k3s[540]: time="2021-07-28T20:16:22.103212601+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:22 http: TLS handshake error from 172.16.0.51:58596: remote erro>
Jul 28 20:16:22 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:22.106+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 2811"}
Jul 28 20:16:22 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:22.106+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 2812"}
Jul 28 20:16:22 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:22.106+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 2812"}
Jul 28 20:16:22 k3s-management-1 k3s[540]: {"level":"info","ts":"2021-07-28T20:16:22.106+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5>
Jul 28 20:16:22 k3s-management-1 k3s[540]: time="2021-07-28T20:16:22.529027437+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:22 http: TLS handshake error from 172.16.0.10:58148: remote erro>
Jul 28 20:16:24 k3s-management-1 k3s[540]: time="2021-07-28T20:16:24.112216357+02:00" level=info msg="Cluster-Http-Server 2021/07/28 20:16:24 http: TLS handshake error from 172.16.0.51:58604: remote erro>
Jul 28 20:16:25 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:25.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 20:16:25 k3s-management-1 k3s[540]: {"level":"warn","ts":"2021-07-28T20:16:25.663+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN


brandond commented 3 years ago

Etcd should only be deployed with an odd number of nodes. A two-node etcd cluster has no quorum tolerance - loss of any single node will cause an outage, which is where you find yourself at the moment.
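
(For reference, etcd quorum is floor(n/2) + 1: a 2-member cluster has quorum 2, so it needs both members and tolerates zero failures, while a 3-member cluster has quorum 2 and tolerates the loss of one member. That arithmetic is why removing one of the two servers took the datastore down.)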

You can reset your etcd cluster to a single node by stopping K3s, running k3s server --cluster-reset, and then starting the service again.
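
In shell terms, that recovery sequence looks roughly like this (a sketch assuming the default systemd unit name k3s):

systemctl stop k3s
k3s server --cluster-reset
systemctl start k3s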

pandarun commented 3 years ago

@brandond thank you for the quick response!

I've added two more nodes as you suggested:

Welcome to Ubuntu 20.04.2 LTS (GNU/Linux 5.4.0-72-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage
Last login: Wed Jul 28 20:08:43 2021 from 8.21.110.25
root@k3s-management-1:~# kubectl get pods
Error from server (ServiceUnavailable): the server is currently unable to handle the request
root@k3s-management-1:~# service k3s status
● k3s.service - Lightweight Kubernetes
     Loaded: loaded (/etc/systemd/system/k3s.service; enabled; vendor preset: enabled)
     Active: activating (start) since Wed 2021-07-28 23:06:53 CEST; 1min 28s ago
       Docs: https://k3s.io
    Process: 495 ExecStartPre=/sbin/modprobe br_netfilter (code=exited, status=0/SUCCESS)
    Process: 533 ExecStartPre=/sbin/modprobe overlay (code=exited, status=0/SUCCESS)
   Main PID: 539 (k3s-server)
      Tasks: 9
     Memory: 612.3M
     CGroup: /system.slice/k3s.service
             └─539 /usr/local/bin/k3s server

Jul 28 23:08:21 k3s-management-1 k3s[539]: {"level":"info","ts":"2021-07-28T23:08:21.530+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4217"}
Jul 28 23:08:21 k3s-management-1 k3s[539]: {"level":"info","ts":"2021-07-28T23:08:21.530+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4218"}
Jul 28 23:08:21 k3s-management-1 k3s[539]: {"level":"info","ts":"2021-07-28T23:08:21.530+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4218"}
Jul 28 23:08:21 k3s-management-1 k3s[539]: {"level":"info","ts":"2021-07-28T23:08:21.530+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5>
Jul 28 23:08:22 k3s-management-1 k3s[539]: {"level":"warn","ts":"2021-07-28T23:08:22.073+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 23:08:22 k3s-management-1 k3s[539]: {"level":"warn","ts":"2021-07-28T23:08:22.073+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUN>
Jul 28 23:08:22 k3s-management-1 k3s[539]: {"level":"warn","ts":"2021-07-28T23:08:22.079+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:/>
Jul 28 23:08:22 k3s-management-1 k3s[539]: time="2021-07-28T23:08:22.080029839+02:00" level=info msg="Failed to test data store connection: context deadline exceeded"
Jul 28 23:08:22 k3s-management-1 k3s[539]: {"level":"warn","ts":"2021-07-28T23:08:22.080+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0>
Jul 28 23:08:22 k3s-management-1 k3s[539]: time="2021-07-28T23:08:22.190440368+02:00" level=info msg="Cluster-Http-Server 2021/07/28 23:08:22 http: TLS handshake error from 172.16.0.51:42764: remote erro>

Welcome to Ubuntu 20.04.2 LTS (GNU/Linux 5.4.0-77-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage
root@k3s-management-2:~# service k3s status
● k3s.service - Lightweight Kubernetes
     Loaded: loaded (/etc/systemd/system/k3s.service; enabled; vendor preset: enabled)
     Active: activating (auto-restart) (Result: exit-code) since Wed 2021-07-28 23:09:39 CEST; 4s ago
       Docs: https://k3s.io
    Process: 167094 ExecStartPre=/bin/sh -xc ! /usr/bin/systemctl is-enabled --quiet nm-cloud-setup.service (code=exited, status=0/SUCCESS)
    Process: 167103 ExecStartPre=/sbin/modprobe br_netfilter (code=exited, status=0/SUCCESS)
    Process: 167104 ExecStartPre=/sbin/modprobe overlay (code=exited, status=0/SUCCESS)
    Process: 167105 ExecStart=/usr/local/bin/k3s server --token=EYJIWvtPSoP4q7cRpgSF1n0NadZ6YDGp5Ci0FrmsZdesPilEyieQ02ZmXpIu2g9tyRW1zkZcvu5o5SaE9XjQ/t/Ofdqnqpo/wLS7YMrfHfT1mjo1gAaRNA4UatyKMPd9Qk3gYItQnhO>
   Main PID: 167105 (code=exited, status=1/FAILURE)
root@k3s-management-2:~# kubectl get pods
The connection to the server localhost:8080 was refused - did you specify the right host or port?

Jul 28 23:12:46 k3s-management-2 systemd[1]: Failed to start Lightweight Kubernetes.
Jul 28 23:12:51 k3s-management-2 systemd[1]: k3s.service: Scheduled restart job, restart counter is at 5536.
Jul 28 23:12:51 k3s-management-2 systemd[1]: Stopped Lightweight Kubernetes.
Jul 28 23:12:51 k3s-management-2 systemd[1]: Starting Lightweight Kubernetes...
Jul 28 23:12:51 k3s-management-2 sh[168142]: + /usr/bin/systemctl is-enabled --quiet nm-cloud-setup.service
Jul 28 23:12:51 k3s-management-2 sh[168143]: Failed to get unit file state for nm-cloud-setup.service: No such file or directory
Jul 28 23:12:52 k3s-management-2 k3s[168146]: time="2021-07-28T23:12:52.286530057+02:00" level=info msg="Starting k3s v1.21.3+k3s1 (1d1f220f)"
Jul 28 23:12:52 k3s-management-2 k3s[168146]: time="2021-07-28T23:12:52.342163603+02:00" level=fatal msg="starting kubernetes: preparing server: failed to get CA certs: https://172.16.0.5:6443/cacerts: 5>
Jul 28 23:12:52 k3s-management-2 systemd[1]: k3s.service: Main process exited, code=exited, status=1/FAILURE
Jul 28 23:12:52 k3s-management-2 systemd[1]: k3s.service: Failed with result 'exit-code'.
Jul 28 23:12:52 k3s-management-2 systemd[1]: Failed to start Lightweight Kubernetes.

Welcome to Ubuntu 20.04.2 LTS (GNU/Linux 5.4.0-77-generic x86_64)

 * Documentation:  https://help.ubuntu.com
 * Management:     https://landscape.canonical.com
 * Support:        https://ubuntu.com/advantage

root@k3s-management-3:~# kubectl get pods
The connection to the server localhost:8080 was refused - did you specify the right host or port?
root@k3s-management-3:~# service k3s status
● k3s.service - Lightweight Kubernetes
     Loaded: loaded (/etc/systemd/system/k3s.service; enabled; vendor preset: enabled)
     Active: activating (auto-restart) (Result: exit-code) since Wed 2021-07-28 23:11:08 CEST; 2s ago
       Docs: https://k3s.io
    Process: 19637 ExecStartPre=/bin/sh -xc ! /usr/bin/systemctl is-enabled --quiet nm-cloud-setup.service (code=exited, status=0/SUCCESS)
    Process: 19639 ExecStartPre=/sbin/modprobe br_netfilter (code=exited, status=0/SUCCESS)
    Process: 19640 ExecStartPre=/sbin/modprobe overlay (code=exited, status=0/SUCCESS)
    Process: 19641 ExecStart=/usr/local/bin/k3s server --token=<REDACTED>>
   Main PID: 19641 (code=exited, status=1/FAILURE)

Jul 28 23:12:08 k3s-management-3 systemd[1]: k3s.service: Scheduled restart job, restart counter is at 442.
Jul 28 23:12:08 k3s-management-3 systemd[1]: Stopped Lightweight Kubernetes.
Jul 28 23:12:08 k3s-management-3 systemd[1]: Starting Lightweight Kubernetes...
Jul 28 23:12:08 k3s-management-3 sh[19920]: + /usr/bin/systemctl is-enabled --quiet nm-cloud-setup.service
Jul 28 23:12:08 k3s-management-3 sh[19924]: Failed to get unit file state for nm-cloud-setup.service: No such file or directory
Jul 28 23:12:08 k3s-management-3 k3s[19937]: time="2021-07-28T23:12:08.538703259+02:00" level=info msg="Starting k3s v1.21.3+k3s1 (1d1f220f)"
Jul 28 23:12:08 k3s-management-3 k3s[19937]: time="2021-07-28T23:12:08.580882296+02:00" level=fatal msg="starting kubernetes: preparing server: failed to get CA certs: https://172.16.0.5:6443/cacerts: 50>
Jul 28 23:12:08 k3s-management-3 systemd[1]: k3s.service: Main process exited, code=exited, status=1/FAILURE
Jul 28 23:12:08 k3s-management-3 systemd[1]: k3s.service: Failed with result 'exit-code'.
Jul 28 23:12:08 k3s-management-3 systemd[1]: Failed to start Lightweight Kubernetes.

But it didn't help; it seems like my main node (k3s-management-1) can't serve certificates to the two additional server nodes.
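
One quick way to test that theory from one of the joining nodes is to request the CA bundle endpoint that shows up in the failure messages above (a sketch; 172.16.0.5 is the first server's address taken from those logs, and -k is needed because the cluster CA isn't trusted yet):

curl -vk https://172.16.0.5:6443/cacerts

This should return the cluster CA certificate in PEM form; an HTTP error here would explain the join failures on k3s-management-2 and k3s-management-3.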

brandond commented 3 years ago

Did you reset the cluster back to a single node first? You can't do anything until you fix the etcd cluster.

pandarun commented 3 years ago

@brandond I did:


root@k3s-management-1:~# service k3s stop
root@k3s-management-1:~# k3s server --cluster-reset
INFO[2021-07-28T23:39:43.185355808+02:00] Starting k3s v1.20.7+k3s1 (aa768cbd)         
INFO[2021-07-28T23:39:43.186155248+02:00] Managed etcd cluster bootstrap already complete and initialized 
{"level":"info","ts":"2021-07-28T23:39:43.232+0200","caller":"embed/etcd.go:117","msg":"configuring peer listeners","listen-peer-urls":["https://162.55.57.75:2380"]}
{"level":"info","ts":"2021-07-28T23:39:43.232+0200","caller":"embed/etcd.go:468","msg":"starting with peer TLS","tls-info":"cert = /var/lib/rancher/k3s/server/tls/etcd/peer-server-client.crt, key = /var/lib/rancher/k3s/server/tls/etcd/peer-server-client.key, trusted-ca = /var/lib/rancher/k3s/server/tls/etcd/peer-ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
{"level":"info","ts":"2021-07-28T23:39:43.233+0200","caller":"embed/etcd.go:127","msg":"configuring client listeners","listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"]}
{"level":"info","ts":"2021-07-28T23:39:43.234+0200","caller":"embed/etcd.go:302","msg":"starting an etcd server","etcd-version":"3.4.13","git-sha":"Not provided (use ./build instead of go build)","go-version":"go1.15.12","go-os":"linux","go-arch":"amd64","max-cpu-set":2,"max-cpu-available":2,"member-initialized":true,"name":"k3s-management-1-8d09628b","data-dir":"/var/lib/rancher/k3s/server/db/etcd","wal-dir":"","wal-dir-dedicated":"","member-dir":"/var/lib/rancher/k3s/server/db/etcd/member","force-new-cluster":false,"heartbeat-interval":"500ms","election-timeout":"5s","initial-election-tick-advance":true,"snapshot-count":100000,"snapshot-catchup-entries":5000,"initial-advertise-peer-urls":["http://localhost:2380"],"listen-peer-urls":["https://162.55.57.75:2380"],"advertise-client-urls":["https://162.55.57.75:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"],"cors":["*"],"host-whitelist":["*"],"initial-cluster":"","initial-cluster-state":"new","initial-cluster-token":"","quota-size-bytes":2147483648,"pre-vote":false,"initial-corrupt-check":false,"corrupt-check-time-interval":"0s","auto-compaction-mode":"","auto-compaction-retention":"0s","auto-compaction-interval":"0s","discovery-url":"","discovery-proxy":""}
{"level":"info","ts":"2021-07-28T23:39:43.250+0200","caller":"etcdserver/backend.go:80","msg":"opened backend db","path":"/var/lib/rancher/k3s/server/db/etcd/member/snap/db","took":"15.673613ms"}
INFO[2021-07-28T23:39:43.553243805+02:00] Cluster-Http-Server 2021/07/28 23:39:43 http: TLS handshake error from 172.16.0.51:50252: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:43.803216186+02:00] Cluster-Http-Server 2021/07/28 23:39:43 http: TLS handshake error from 172.16.0.10:38732: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:39:44.190+0200","caller":"etcdserver/server.go:451","msg":"recovered v2 store from snapshot","snapshot-index":18700197,"snapshot-size":"18 kB"}
{"level":"info","ts":"2021-07-28T23:39:44.192+0200","caller":"mvcc/kvstore.go:380","msg":"restored last compact revision","meta-bucket-name":"meta","meta-bucket-name-key":"finishedCompactRev","restored-compact-revision":15207359}
{"level":"info","ts":"2021-07-28T23:39:44.203+0200","caller":"etcdserver/server.go:469","msg":"recovered v3 backend from snapshot","backend-size-bytes":23179264,"backend-size":"23 MB","backend-size-in-use-bytes":10567680,"backend-size-in-use":"11 MB"}
INFO[2021-07-28T23:39:44.296746853+02:00] Cluster-Http-Server 2021/07/28 23:39:44 http: TLS handshake error from 172.16.0.11:60876: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:39:44.475+0200","caller":"etcdserver/raft.go:536","msg":"restarting local member","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","commit-index":18782947}
{"level":"info","ts":"2021-07-28T23:39:44.479+0200","caller":"raft/raft.go:1530","msg":"a5ff6d2872667bd3 switched to configuration voters=(6610737761517497322 11961399155804765139 18331184626531864927)"}
{"level":"info","ts":"2021-07-28T23:39:44.479+0200","caller":"raft/raft.go:700","msg":"a5ff6d2872667bd3 became follower at term 4478"}
{"level":"info","ts":"2021-07-28T23:39:44.479+0200","caller":"raft/raft.go:383","msg":"newRaft a5ff6d2872667bd3 [peers: [5bbe0ef80a82ebea,a5ff6d2872667bd3,fe65734b8888915f], term: 4478, commit: 18782947, applied: 18700197, lastindex: 18782950, lastterm: 40]"}
{"level":"info","ts":"2021-07-28T23:39:44.480+0200","caller":"api/capability.go:76","msg":"enabled capabilities for version","cluster-version":"3.4"}
{"level":"info","ts":"2021-07-28T23:39:44.480+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"5bbe0ef80a82ebea","recovered-remote-peer-urls":["https://172.16.0.10:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.480+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"a5ff6d2872667bd3","recovered-remote-peer-urls":["https://172.16.0.5:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.480+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"fe65734b8888915f","recovered-remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.480+0200","caller":"membership/cluster.go:269","msg":"set cluster version from store","cluster-version":"3.4"}
{"level":"warn","ts":"2021-07-28T23:39:44.481+0200","caller":"auth/store.go:1366","msg":"simple token is not cryptographically signed"}
{"level":"info","ts":"2021-07-28T23:39:44.482+0200","caller":"mvcc/kvstore.go:380","msg":"restored last compact revision","meta-bucket-name":"meta","meta-bucket-name-key":"finishedCompactRev","restored-compact-revision":15207359}
{"level":"info","ts":"2021-07-28T23:39:44.489+0200","caller":"etcdserver/quota.go:98","msg":"enabled backend quota with default value","quota-name":"v3-applier","quota-size-bytes":2147483648,"quota-size":"2.1 GB"}
{"level":"info","ts":"2021-07-28T23:39:44.490+0200","caller":"rafthttp/peer.go:128","msg":"starting remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.490+0200","caller":"rafthttp/pipeline.go:71","msg":"started HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/peer.go:134","msg":"started remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/transport.go:327","msg":"added remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea","remote-peer-urls":["https://172.16.0.10:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/peer.go:128","msg":"starting remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/pipeline.go:71","msg":"started HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.491+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/peer.go:134","msg":"started remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/transport.go:327","msg":"added remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f","remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"etcdserver/server.go:790","msg":"starting etcd server","local-member-id":"a5ff6d2872667bd3","local-server-version":"3.4.13","cluster-id":"24c8f0b9fbd23be7","cluster-version":"3.4"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.492+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.494+0200","caller":"embed/etcd.go:711","msg":"starting with client TLS","tls-info":"cert = /var/lib/rancher/k3s/server/tls/etcd/server-client.crt, key = /var/lib/rancher/k3s/server/tls/etcd/server-client.key, trusted-ca = /var/lib/rancher/k3s/server/tls/etcd/server-ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
{"level":"info","ts":"2021-07-28T23:39:44.494+0200","caller":"embed/etcd.go:244","msg":"now serving peer/client/metrics","local-member-id":"a5ff6d2872667bd3","initial-advertise-peer-urls":["http://localhost:2380"],"listen-peer-urls":["https://162.55.57.75:2380"],"advertise-client-urls":["https://162.55.57.75:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
INFO[2021-07-28T23:39:44.494586176+02:00] Running kube-apiserver --advertise-port=6443 --allow-privileged=true --anonymous-auth=false --api-audiences=https://kubernetes.default.svc.cluster.local,k3s --authorization-mode=Node,RBAC --bind-address=127.0.0.1 --cert-dir=/var/lib/rancher/k3s/server/tls/temporary-certs --client-ca-file=/var/lib/rancher/k3s/server/tls/client-ca.crt --enable-admission-plugins=NodeRestriction --etcd-cafile=/var/lib/rancher/k3s/server/tls/etcd/server-ca.crt --etcd-certfile=/var/lib/rancher/k3s/server/tls/etcd/client.crt --etcd-keyfile=/var/lib/rancher/k3s/server/tls/etcd/client.key --etcd-servers=https://127.0.0.1:2379 --feature-gates=ServiceAccountIssuerDiscovery=false --insecure-port=0 --kubelet-certificate-authority=/var/lib/rancher/k3s/server/tls/server-ca.crt --kubelet-client-certificate=/var/lib/rancher/k3s/server/tls/client-kube-apiserver.crt --kubelet-client-key=/var/lib/rancher/k3s/server/tls/client-kube-apiserver.key --profiling=false --proxy-client-cert-file=/var/lib/rancher/k3s/server/tls/client-auth-proxy.crt --proxy-client-key-file=/var/lib/rancher/k3s/server/tls/client-auth-proxy.key --requestheader-allowed-names=system:auth-proxy --requestheader-client-ca-file=/var/lib/rancher/k3s/server/tls/request-header-ca.crt --requestheader-extra-headers-prefix=X-Remote-Extra- --requestheader-group-headers=X-Remote-Group --requestheader-username-headers=X-Remote-User --secure-port=6444 --service-account-issuer=https://kubernetes.default.svc.cluster.local --service-account-key-file=/var/lib/rancher/k3s/server/tls/service.key --service-account-signing-key-file=/var/lib/rancher/k3s/server/tls/service.key --service-cluster-ip-range=10.43.0.0/16 --service-node-port-range=30000-32767 --storage-backend=etcd3 --tls-cert-file=/var/lib/rancher/k3s/server/tls/serving-kube-apiserver.crt --tls-private-key-file=/var/lib/rancher/k3s/server/tls/serving-kube-apiserver.key 
{"level":"info","ts":"2021-07-28T23:39:44.494+0200","caller":"etcdserver/server.go:691","msg":"starting initial election tick advance","election-ticks":10}
{"level":"info","ts":"2021-07-28T23:39:44.495+0200","caller":"embed/etcd.go:579","msg":"serving peer traffic","address":"162.55.57.75:2380"}
{"level":"info","ts":"2021-07-28T23:39:44.495+0200","caller":"embed/etcd.go:781","msg":"serving metrics","address":"http://127.0.0.1:2381"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"raft/raft.go:1530","msg":"a5ff6d2872667bd3 switched to configuration voters=(6610737761517497322 11961399155804765139)"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"membership/cluster.go:422","msg":"removed member","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"fe65734b8888915f","removed-remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/peer.go:333","msg":"stopping remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"warn","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"fe65734b8888915f"}
{"level":"warn","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/pipeline.go:86","msg":"stopped HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/peer.go:340","msg":"stopped remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-28T23:39:44.664+0200","caller":"rafthttp/transport.go:369","msg":"removed remote peer","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"fe65734b8888915f"}
INFO[2021-07-28T23:39:45.561249789+02:00] Cluster-Http-Server 2021/07/28 23:39:45 http: TLS handshake error from 172.16.0.51:50260: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:47.571241547+02:00] Cluster-Http-Server 2021/07/28 23:39:47 http: TLS handshake error from 172.16.0.51:50268: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:49.278893948+02:00] Cluster-Http-Server 2021/07/28 23:39:49 http: TLS handshake error from 172.16.0.10:38736: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:39:49.492+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:39:49.492+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:39:49.578398290+02:00] Cluster-Http-Server 2021/07/28 23:39:49 http: TLS handshake error from 172.16.0.51:50276: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:49.822614200+02:00] Cluster-Http-Server 2021/07/28 23:39:49 http: TLS handshake error from 172.16.0.11:60880: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:39:51.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4478"}
{"level":"info","ts":"2021-07-28T23:39:51.481+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4479"}
{"level":"info","ts":"2021-07-28T23:39:51.481+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4479"}
{"level":"info","ts":"2021-07-28T23:39:51.481+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4479"}
INFO[2021-07-28T23:39:51.588292897+02:00] Cluster-Http-Server 2021/07/28 23:39:51 http: TLS handshake error from 172.16.0.51:50284: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:53.599736163+02:00] Cluster-Http-Server 2021/07/28 23:39:53 http: TLS handshake error from 172.16.0.51:50292: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:39:54.493+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:39:54.493+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:39:54.495+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:39:54.495781572+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:39:54.496+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:39:54.790963045+02:00] Cluster-Http-Server 2021/07/28 23:39:54 http: TLS handshake error from 172.16.0.10:38740: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:55.316963311+02:00] Cluster-Http-Server 2021/07/28 23:39:55 http: TLS handshake error from 172.16.0.11:60884: remote error: tls: bad certificate 
INFO[2021-07-28T23:39:55.609466276+02:00] Cluster-Http-Server 2021/07/28 23:39:55 http: TLS handshake error from 172.16.0.51:50300: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:39:57.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4479"}
{"level":"info","ts":"2021-07-28T23:39:57.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4480"}
{"level":"info","ts":"2021-07-28T23:39:57.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4480"}
{"level":"info","ts":"2021-07-28T23:39:57.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4480"}
INFO[2021-07-28T23:39:57.619077909+02:00] Cluster-Http-Server 2021/07/28 23:39:57 http: TLS handshake error from 172.16.0.51:50308: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:39:59.494+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:39:59.494+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:39:59.494+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
INFO[2021-07-28T23:39:59.629267283+02:00] Cluster-Http-Server 2021/07/28 23:39:59 http: TLS handshake error from 172.16.0.51:50316: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:00.289829623+02:00] Cluster-Http-Server 2021/07/28 23:40:00 http: TLS handshake error from 172.16.0.10:38744: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:00.845899345+02:00] Cluster-Http-Server 2021/07/28 23:40:00 http: TLS handshake error from 172.16.0.11:60888: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:01.638053209+02:00] Cluster-Http-Server 2021/07/28 23:40:01 http: TLS handshake error from 172.16.0.51:50324: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:02.980+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4480"}
{"level":"info","ts":"2021-07-28T23:40:02.980+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4481"}
{"level":"info","ts":"2021-07-28T23:40:02.980+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4481"}
{"level":"info","ts":"2021-07-28T23:40:02.980+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4481"}
{"level":"warn","ts":"2021-07-28T23:40:04.219+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\". Reconnecting..."}
{"level":"warn","ts":"2021-07-28T23:40:04.494+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:04.494+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:04.662207395+02:00] Cluster-Http-Server 2021/07/28 23:40:04 http: TLS handshake error from 172.16.0.51:50332: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:05.763009372+02:00] Cluster-Http-Server 2021/07/28 23:40:05 http: TLS handshake error from 172.16.0.10:38748: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:06.375075792+02:00] Cluster-Http-Server 2021/07/28 23:40:06 http: TLS handshake error from 172.16.0.11:60892: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:06.673085876+02:00] Cluster-Http-Server 2021/07/28 23:40:06 http: TLS handshake error from 172.16.0.51:50340: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:08.221+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
ERRO[2021-07-28T23:40:08.221632204+02:00] Failed to check local etcd status for learner management: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:08.221+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:40:08.683422948+02:00] Cluster-Http-Server 2021/07/28 23:40:08 http: TLS handshake error from 172.16.0.51:50348: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:09.495+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:09.495+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:09.496+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:40:09.496392461+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:09.497+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
{"level":"info","ts":"2021-07-28T23:40:09.980+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4481"}
{"level":"info","ts":"2021-07-28T23:40:09.980+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4482"}
{"level":"info","ts":"2021-07-28T23:40:09.980+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4482"}
{"level":"info","ts":"2021-07-28T23:40:09.980+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4482"}
INFO[2021-07-28T23:40:10.695035097+02:00] Cluster-Http-Server 2021/07/28 23:40:10 http: TLS handshake error from 172.16.0.51:50356: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:11.271750178+02:00] Cluster-Http-Server 2021/07/28 23:40:11 http: TLS handshake error from 172.16.0.10:38752: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:11.812168559+02:00] Cluster-Http-Server 2021/07/28 23:40:11 http: TLS handshake error from 172.16.0.11:60896: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:12.705720114+02:00] Cluster-Http-Server 2021/07/28 23:40:12 http: TLS handshake error from 172.16.0.51:50364: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:14.495+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
{"level":"warn","ts":"2021-07-28T23:40:14.495+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:14.495+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:14.715663113+02:00] Cluster-Http-Server 2021/07/28 23:40:14 http: TLS handshake error from 172.16.0.51:50372: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:15.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4482"}
{"level":"info","ts":"2021-07-28T23:40:15.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4483"}
{"level":"info","ts":"2021-07-28T23:40:15.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4483"}
{"level":"info","ts":"2021-07-28T23:40:15.481+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4483"}
INFO[2021-07-28T23:40:16.723565823+02:00] Cluster-Http-Server 2021/07/28 23:40:16 http: TLS handshake error from 172.16.0.51:50380: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:16.828329390+02:00] Cluster-Http-Server 2021/07/28 23:40:16 http: TLS handshake error from 172.16.0.10:38756: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:17.297566237+02:00] Cluster-Http-Server 2021/07/28 23:40:17 http: TLS handshake error from 172.16.0.11:60900: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:18.734033386+02:00] Cluster-Http-Server 2021/07/28 23:40:18 http: TLS handshake error from 172.16.0.51:50388: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:19.496+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:19.496+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:20.748373181+02:00] Cluster-Http-Server 2021/07/28 23:40:20 http: TLS handshake error from 172.16.0.51:50396: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:22.315957906+02:00] Cluster-Http-Server 2021/07/28 23:40:22 http: TLS handshake error from 172.16.0.10:38760: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:22.755779637+02:00] Cluster-Http-Server 2021/07/28 23:40:22 http: TLS handshake error from 172.16.0.51:50404: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:22.821164636+02:00] Cluster-Http-Server 2021/07/28 23:40:22 http: TLS handshake error from 172.16.0.11:60904: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:23.221+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
ERRO[2021-07-28T23:40:23.221480775+02:00] Failed to check local etcd status for learner management: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:23.221+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
{"level":"info","ts":"2021-07-28T23:40:23.980+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4483"}
{"level":"info","ts":"2021-07-28T23:40:23.980+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4484"}
{"level":"info","ts":"2021-07-28T23:40:23.980+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4484"}
{"level":"info","ts":"2021-07-28T23:40:23.980+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4484"}
{"level":"warn","ts":"2021-07-28T23:40:24.496+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:24.497+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:24.497+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:40:24.497790382+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:24.497+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:40:24.763366764+02:00] Cluster-Http-Server 2021/07/28 23:40:24 http: TLS handshake error from 172.16.0.51:50412: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:25.946+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\". Reconnecting..."}
INFO[2021-07-28T23:40:26.772413046+02:00] Cluster-Http-Server 2021/07/28 23:40:26 http: TLS handshake error from 172.16.0.51:50420: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:27.785426602+02:00] Cluster-Http-Server 2021/07/28 23:40:27 http: TLS handshake error from 172.16.0.10:38764: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:28.308331246+02:00] Cluster-Http-Server 2021/07/28 23:40:28 http: TLS handshake error from 172.16.0.11:60908: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:28.781122834+02:00] Cluster-Http-Server 2021/07/28 23:40:28 http: TLS handshake error from 172.16.0.51:50428: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:29.496+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
{"level":"warn","ts":"2021-07-28T23:40:29.497+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:29.497+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:30.789124702+02:00] Cluster-Http-Server 2021/07/28 23:40:30 http: TLS handshake error from 172.16.0.51:50436: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:32.798496007+02:00] Cluster-Http-Server 2021/07/28 23:40:32 http: TLS handshake error from 172.16.0.51:50444: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:33.341222916+02:00] Cluster-Http-Server 2021/07/28 23:40:33 http: TLS handshake error from 172.16.0.10:38768: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:33.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4484"}
{"level":"info","ts":"2021-07-28T23:40:33.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4485"}
{"level":"info","ts":"2021-07-28T23:40:33.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4485"}
{"level":"info","ts":"2021-07-28T23:40:33.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4485"}
INFO[2021-07-28T23:40:33.847228893+02:00] Cluster-Http-Server 2021/07/28 23:40:33 http: TLS handshake error from 172.16.0.11:60912: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:34.497+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:34.497+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:34.805829391+02:00] Cluster-Http-Server 2021/07/28 23:40:34 http: TLS handshake error from 172.16.0.51:50452: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:36.815252742+02:00] Cluster-Http-Server 2021/07/28 23:40:36 http: TLS handshake error from 172.16.0.51:50460: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:38.221+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
ERRO[2021-07-28T23:40:38.221578401+02:00] Failed to check local etcd status for learner management: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:38.221+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:40:38.782074788+02:00] Cluster-Http-Server 2021/07/28 23:40:38 http: TLS handshake error from 172.16.0.10:38772: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:38.823869921+02:00] Cluster-Http-Server 2021/07/28 23:40:38 http: TLS handshake error from 172.16.0.51:50468: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:39.319426056+02:00] Cluster-Http-Server 2021/07/28 23:40:39 http: TLS handshake error from 172.16.0.11:60916: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:39.498+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:39.498+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:39.498+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:40:39.498650363+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:39.498+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:40:40.831282814+02:00] Cluster-Http-Server 2021/07/28 23:40:40 http: TLS handshake error from 172.16.0.51:50476: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:40.980+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4485"}
{"level":"info","ts":"2021-07-28T23:40:40.980+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4486"}
{"level":"info","ts":"2021-07-28T23:40:40.980+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4486"}
{"level":"info","ts":"2021-07-28T23:40:40.980+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4486"}
INFO[2021-07-28T23:40:42.839763194+02:00] Cluster-Http-Server 2021/07/28 23:40:42 http: TLS handshake error from 172.16.0.51:50484: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:44.271541130+02:00] Cluster-Http-Server 2021/07/28 23:40:44 http: TLS handshake error from 172.16.0.10:38776: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:44.496+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
{"level":"warn","ts":"2021-07-28T23:40:44.498+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:44.498+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:44.849843555+02:00] Cluster-Http-Server 2021/07/28 23:40:44 http: TLS handshake error from 172.16.0.51:50492: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:44.904726269+02:00] Cluster-Http-Server 2021/07/28 23:40:44 http: TLS handshake error from 172.16.0.11:60920: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:46.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4486"}
{"level":"info","ts":"2021-07-28T23:40:46.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4487"}
{"level":"info","ts":"2021-07-28T23:40:46.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4487"}
{"level":"info","ts":"2021-07-28T23:40:46.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4487"}
INFO[2021-07-28T23:40:46.861232865+02:00] Cluster-Http-Server 2021/07/28 23:40:46 http: TLS handshake error from 172.16.0.51:50500: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:48.870935458+02:00] Cluster-Http-Server 2021/07/28 23:40:48 http: TLS handshake error from 172.16.0.51:50508: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:48.920+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\". Reconnecting..."}
{"level":"warn","ts":"2021-07-28T23:40:49.499+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:49.499+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:49.738441778+02:00] Cluster-Http-Server 2021/07/28 23:40:49 http: TLS handshake error from 172.16.0.10:38780: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:50.326823206+02:00] Cluster-Http-Server 2021/07/28 23:40:50 http: TLS handshake error from 172.16.0.11:60924: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:50.880061005+02:00] Cluster-Http-Server 2021/07/28 23:40:50 http: TLS handshake error from 172.16.0.51:50516: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:52.888567488+02:00] Cluster-Http-Server 2021/07/28 23:40:52 http: TLS handshake error from 172.16.0.51:50524: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:53.221+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
ERRO[2021-07-28T23:40:53.221250815+02:00] Failed to check local etcd status for learner management: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:53.221+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
{"level":"info","ts":"2021-07-28T23:40:53.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4487"}
{"level":"info","ts":"2021-07-28T23:40:53.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4488"}
{"level":"info","ts":"2021-07-28T23:40:53.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4488"}
{"level":"info","ts":"2021-07-28T23:40:53.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4488"}
{"level":"warn","ts":"2021-07-28T23:40:54.498+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:40:54.499024253+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:40:54.499+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
{"level":"warn","ts":"2021-07-28T23:40:54.499+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:54.499+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:40:54.898980186+02:00] Cluster-Http-Server 2021/07/28 23:40:54 http: TLS handshake error from 172.16.0.51:50532: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:55.282578040+02:00] Cluster-Http-Server 2021/07/28 23:40:55 http: TLS handshake error from 172.16.0.10:38784: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:55.812825794+02:00] Cluster-Http-Server 2021/07/28 23:40:55 http: TLS handshake error from 172.16.0.11:60928: remote error: tls: bad certificate 
INFO[2021-07-28T23:40:56.914537085+02:00] Cluster-Http-Server 2021/07/28 23:40:56 http: TLS handshake error from 172.16.0.51:50540: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:40:58.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4488"}
{"level":"info","ts":"2021-07-28T23:40:58.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4489"}
{"level":"info","ts":"2021-07-28T23:40:58.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4489"}
{"level":"info","ts":"2021-07-28T23:40:58.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4489"}
INFO[2021-07-28T23:40:58.924981414+02:00] Cluster-Http-Server 2021/07/28 23:40:58 http: TLS handshake error from 172.16.0.51:50548: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:40:59.496+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
{"level":"warn","ts":"2021-07-28T23:40:59.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:40:59.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:41:00.780541183+02:00] Cluster-Http-Server 2021/07/28 23:41:00 http: TLS handshake error from 172.16.0.10:38788: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:00.933932377+02:00] Cluster-Http-Server 2021/07/28 23:41:00 http: TLS handshake error from 172.16.0.51:50556: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:01.327516914+02:00] Cluster-Http-Server 2021/07/28 23:41:01 http: TLS handshake error from 172.16.0.11:60932: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:02.950222112+02:00] Cluster-Http-Server 2021/07/28 23:41:02 http: TLS handshake error from 172.16.0.51:50564: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:41:04.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:41:04.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:41:04.958676965+02:00] Cluster-Http-Server 2021/07/28 23:41:04 http: TLS handshake error from 172.16.0.51:50572: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:06.262133166+02:00] Cluster-Http-Server 2021/07/28 23:41:06 http: TLS handshake error from 172.16.0.10:38792: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:41:06.480+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4489"}
{"level":"info","ts":"2021-07-28T23:41:06.480+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4490"}
{"level":"info","ts":"2021-07-28T23:41:06.480+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4490"}
{"level":"info","ts":"2021-07-28T23:41:06.480+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4490"}
INFO[2021-07-28T23:41:06.772955418+02:00] Cluster-Http-Server 2021/07/28 23:41:06 http: TLS handshake error from 172.16.0.11:60936: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:06.968262717+02:00] Cluster-Http-Server 2021/07/28 23:41:06 http: TLS handshake error from 172.16.0.51:50580: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:41:08.221+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
ERRO[2021-07-28T23:41:08.221718853+02:00] Failed to check local etcd status for learner management: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:41:08.221+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
INFO[2021-07-28T23:41:08.975516092+02:00] Cluster-Http-Server 2021/07/28 23:41:08 http: TLS handshake error from 172.16.0.51:50588: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:41:09.499+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = DeadlineExceeded desc = context deadline exceeded"}
INFO[2021-07-28T23:41:09.499665558+02:00] Failed to test data store connection: context deadline exceeded 
{"level":"warn","ts":"2021-07-28T23:41:09.499+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
{"level":"warn","ts":"2021-07-28T23:41:09.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:41:09.500+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:41:10.983453394+02:00] Cluster-Http-Server 2021/07/28 23:41:10 http: TLS handshake error from 172.16.0.51:50596: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:11.795089376+02:00] Cluster-Http-Server 2021/07/28 23:41:11 http: TLS handshake error from 172.16.0.10:38796: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:12.361515597+02:00] Cluster-Http-Server 2021/07/28 23:41:12 http: TLS handshake error from 172.16.0.11:60940: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:12.990729422+02:00] Cluster-Http-Server 2021/07/28 23:41:12 http: TLS handshake error from 172.16.0.51:50604: remote error: tls: bad certificate 
{"level":"warn","ts":"2021-07-28T23:41:13.423+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context deadline exceeded\". Reconnecting..."}
{"level":"warn","ts":"2021-07-28T23:41:14.497+0200","caller":"etcdserver/server.go:2068","msg":"failed to publish local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-8d09628b ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","publish-timeout":"15s","error":"etcdserver: request timed out"}
{"level":"warn","ts":"2021-07-28T23:41:14.501+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_RAFT_MESSAGE","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
{"level":"warn","ts":"2021-07-28T23:41:14.501+0200","caller":"rafthttp/probing_status.go:70","msg":"prober detected unhealthy status","round-tripper-name":"ROUND_TRIPPER_SNAPSHOT","remote-peer-id":"5bbe0ef80a82ebea","rtt":"0s","error":"dial tcp 172.16.0.10:2380: connect: connection refused"}
INFO[2021-07-28T23:41:15.000743663+02:00] Cluster-Http-Server 2021/07/28 23:41:15 http: TLS handshake error from 172.16.0.51:50612: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-28T23:41:15.980+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4490"}
{"level":"info","ts":"2021-07-28T23:41:15.980+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4491"}
{"level":"info","ts":"2021-07-28T23:41:15.980+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4491"}
{"level":"info","ts":"2021-07-28T23:41:15.980+0200","caller":"raft/raft.go:811","msg":"a5ff6d2872667bd3 [logterm: 40, index: 18782950] sent MsgVote request to 5bbe0ef80a82ebea at term 4491"}
INFO[2021-07-28T23:41:17.009981039+02:00] Cluster-Http-Server 2021/07/28 23:41:17 http: TLS handshake error from 172.16.0.51:50620: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:17.290062883+02:00] Cluster-Http-Server 2021/07/28 23:41:17 http: TLS handshake error from 172.16.0.10:38800: remote error: tls: bad certificate 
INFO[2021-07-28T23:41:17.853715472+02:00] Cluster-Http-Server 2021/07/28 23:41:17 http: TLS handshake error from 172.16.0.11:60944: remote error: tls: bad certificate 
^C{"level":"warn","ts":"2021-07-28T23:41:18.229+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = Canceled desc = context canceled"}
INFO[2021-07-28T23:41:18.230176927+02:00] Failed to test data store connection: context canceled 
{"level":"warn","ts":"2021-07-28T23:41:18.230+0200","caller":"clientv3/retry_interceptor.go:62","msg":"retrying of unary invoker failed","target":"passthrough:///https://127.0.0.1:2379","attempt":0,"error":"rpc error: code = Canceled desc = context canceled"}
ERRO[2021-07-28T23:41:18.230840911+02:00] Failed to check local etcd status for learner management: context canceled 
{"level":"warn","ts":"2021-07-28T23:41:18.231+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: authentication handshake failed: context canceled\". Reconnecting..."}
FATA[2021-07-28T23:41:18.232319895+02:00] server stopped: http: Server closed          
root@k3s-management-1:~# 
brandond commented 3 years ago

Can you try upgrading to v1.20.9+k3s1 or newer, and try resetting again? There was a regression with cluster-reset not functioning properly during quorum loss. https://github.com/k3s-io/k3s/releases/tag/v1.20.9%2Bk3s1

You should also make sure that the other servers (if any) are stopped, so they're not constantly failing, restarting, and trying to rejoin the cluster while you're resetting things.
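
For reference, a minimal sketch of that sequence, assuming k3s was installed with the standard install script (so the systemd units are named k3s on servers and k3s-agent on agents); the exact unit names and order are assumptions, not confirmed in this thread:

# on every other server node, stop k3s so it can't keep trying to rejoin mid-reset
systemctl stop k3s
# on agent nodes, stop the agent unit as well
systemctl stop k3s-agent

# on the one server being kept, run the reset with the upgraded binary
k3s server --cluster-reset

# once the reset completes, start the service again and verify the API answers
systemctl start k3s
kubectl get nodes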

pandarun commented 3 years ago

@brandond I did the binary update, but it looks the same - bad certificates:

root@k3s-management-1:~# mv ./k3s /usr/local/bin/k3s
root@k3s-management-1:~# k3s -v
k3s version v1.21.3+k3s1 (1d1f220f)
go version go1.16.6
root@k3s-management-1:~# k3s server --cluster-reset
INFO[0000] Acquiring lock file /var/lib/rancher/k3s/data/.lock 
INFO[0000] Preparing data dir /var/lib/rancher/k3s/data/9df574741d2573cbbe6616e8624488b36b3340d077bc50da7cb167f1b08a64d1 
INFO[2021-07-29T09:57:50.108270964+02:00] Starting k3s v1.21.3+k3s1 (1d1f220f)         
INFO[2021-07-29T09:57:50.110065315+02:00] Managed etcd cluster bootstrap already complete and initialized 
INFO[2021-07-29T09:57:50.155702941+02:00] certificate CN=k3s-cloud-controller-manager signed by CN=k3s-client-ca@1621795019: notBefore=2021-05-23 18:36:59 +0000 UTC notAfter=2022-07-29 07:57:50 +0000 UTC 
INFO[2021-07-29T09:57:50.158909969+02:00] certificate CN=kube-apiserver signed by CN=k3s-server-ca@1621795019: notBefore=2021-05-23 18:36:59 +0000 UTC notAfter=2022-07-29 07:57:50 +0000 UTC 
{"level":"info","ts":"2021-07-29T09:57:50.176+0200","caller":"embed/etcd.go:117","msg":"configuring peer listeners","listen-peer-urls":["https://162.55.57.75:2380"]}
{"level":"info","ts":"2021-07-29T09:57:50.177+0200","caller":"embed/etcd.go:468","msg":"starting with peer TLS","tls-info":"cert = /var/lib/rancher/k3s/server/tls/etcd/peer-server-client.crt, key = /var/lib/rancher/k3s/server/tls/etcd/peer-server-client.key, trusted-ca = /var/lib/rancher/k3s/server/tls/etcd/peer-ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
{"level":"info","ts":"2021-07-29T09:57:50.179+0200","caller":"embed/etcd.go:127","msg":"configuring client listeners","listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"]}
{"level":"info","ts":"2021-07-29T09:57:50.180+0200","caller":"embed/etcd.go:302","msg":"starting an etcd server","etcd-version":"3.4.13","git-sha":"Not provided (use ./build instead of go build)","go-version":"go1.16.6","go-os":"linux","go-arch":"amd64","max-cpu-set":2,"max-cpu-available":2,"member-initialized":true,"name":"k3s-management-1-5327d6a9","data-dir":"/var/lib/rancher/k3s/server/db/etcd","wal-dir":"","wal-dir-dedicated":"","member-dir":"/var/lib/rancher/k3s/server/db/etcd/member","force-new-cluster":true,"heartbeat-interval":"500ms","election-timeout":"5s","initial-election-tick-advance":true,"snapshot-count":100000,"snapshot-catchup-entries":5000,"initial-advertise-peer-urls":["https://162.55.57.75:2380"],"listen-peer-urls":["https://162.55.57.75:2380"],"advertise-client-urls":["https://162.55.57.75:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"],"cors":["*"],"host-whitelist":["*"],"initial-cluster":"","initial-cluster-state":"new","initial-cluster-token":"","quota-size-bytes":2147483648,"pre-vote":false,"initial-corrupt-check":false,"corrupt-check-time-interval":"0s","auto-compaction-mode":"","auto-compaction-retention":"0s","auto-compaction-interval":"0s","discovery-url":"","discovery-proxy":""}
{"level":"warn","ts":"2021-07-29T09:57:50.183+0200","caller":"grpclog/grpclog.go:60","msg":"grpc: addrConn.createTransport failed to connect to {https://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = \"transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused\". Reconnecting..."}
{"level":"info","ts":"2021-07-29T09:57:50.188+0200","caller":"etcdserver/backend.go:80","msg":"opened backend db","path":"/var/lib/rancher/k3s/server/db/etcd/member/snap/db","took":"8.454415ms"}
INFO[2021-07-29T09:57:50.781518539+02:00] Cluster-Http-Server 2021/07/29 09:57:50 http: TLS handshake error from 172.16.0.51:57164: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-29T09:57:51.137+0200","caller":"etcdserver/server.go:451","msg":"recovered v2 store from snapshot","snapshot-index":18700197,"snapshot-size":"18 kB"}
{"level":"info","ts":"2021-07-29T09:57:51.140+0200","caller":"mvcc/kvstore.go:380","msg":"restored last compact revision","meta-bucket-name":"meta","meta-bucket-name-key":"finishedCompactRev","restored-compact-revision":15207359}
{"level":"info","ts":"2021-07-29T09:57:51.147+0200","caller":"etcdserver/server.go:469","msg":"recovered v3 backend from snapshot","backend-size-bytes":23179264,"backend-size":"23 MB","backend-size-in-use-bytes":10567680,"backend-size-in-use":"11 MB"}
{"level":"info","ts":"2021-07-29T09:57:51.441+0200","caller":"etcdserver/raft.go:594","msg":"discarding uncommitted WAL entries","entry-index":18782948,"commit-index-from-wal":18782947,"number-of-discarded-entries":3}
{"level":"info","ts":"2021-07-29T09:57:51.445+0200","caller":"etcdserver/raft.go:632","msg":"forcing restart member","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","commit-index":18782948}
{"level":"info","ts":"2021-07-29T09:57:51.450+0200","caller":"raft/raft.go:1530","msg":"a5ff6d2872667bd3 switched to configuration voters=(6610737761517497322 11961399155804765139 18331184626531864927)"}
{"level":"info","ts":"2021-07-29T09:57:51.450+0200","caller":"raft/raft.go:700","msg":"a5ff6d2872667bd3 became follower at term 4491"}
{"level":"info","ts":"2021-07-29T09:57:51.450+0200","caller":"raft/raft.go:383","msg":"newRaft a5ff6d2872667bd3 [peers: [5bbe0ef80a82ebea,a5ff6d2872667bd3,fe65734b8888915f], term: 4491, commit: 18782948, applied: 18700197, lastindex: 18782948, lastterm: 4491]"}
{"level":"info","ts":"2021-07-29T09:57:51.450+0200","caller":"api/capability.go:76","msg":"enabled capabilities for version","cluster-version":"3.4"}
{"level":"info","ts":"2021-07-29T09:57:51.451+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"5bbe0ef80a82ebea","recovered-remote-peer-urls":["https://172.16.0.10:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.451+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"a5ff6d2872667bd3","recovered-remote-peer-urls":["https://172.16.0.5:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.451+0200","caller":"membership/cluster.go:256","msg":"recovered/added member from store","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","recovered-remote-peer-id":"fe65734b8888915f","recovered-remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.451+0200","caller":"membership/cluster.go:269","msg":"set cluster version from store","cluster-version":"3.4"}
{"level":"warn","ts":"2021-07-29T09:57:51.452+0200","caller":"auth/store.go:1366","msg":"simple token is not cryptographically signed"}
{"level":"info","ts":"2021-07-29T09:57:51.454+0200","caller":"mvcc/kvstore.go:380","msg":"restored last compact revision","meta-bucket-name":"meta","meta-bucket-name-key":"finishedCompactRev","restored-compact-revision":15207359}
{"level":"info","ts":"2021-07-29T09:57:51.481+0200","caller":"etcdserver/quota.go:98","msg":"enabled backend quota with default value","quota-name":"v3-applier","quota-size-bytes":2147483648,"quota-size":"2.1 GB"}
{"level":"info","ts":"2021-07-29T09:57:51.483+0200","caller":"rafthttp/peer.go:128","msg":"starting remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.483+0200","caller":"rafthttp/pipeline.go:71","msg":"started HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.485+0200","caller":"rafthttp/peer.go:134","msg":"started remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.485+0200","caller":"rafthttp/transport.go:327","msg":"added remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea","remote-peer-urls":["https://172.16.0.10:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.485+0200","caller":"rafthttp/peer.go:128","msg":"starting remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.485+0200","caller":"rafthttp/pipeline.go:71","msg":"started HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:166","msg":"started stream writer with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/peer.go:134","msg":"started remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/transport.go:327","msg":"added remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f","remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.487+0200","caller":"etcdserver/server.go:790","msg":"starting etcd server","local-member-id":"a5ff6d2872667bd3","local-server-version":"3.4.13","cluster-id":"24c8f0b9fbd23be7","cluster-version":"3.4"}
{"level":"info","ts":"2021-07-29T09:57:51.486+0200","caller":"rafthttp/stream.go:406","msg":"started stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.487+0200","caller":"etcdserver/server.go:691","msg":"starting initial election tick advance","election-ticks":10}
{"level":"info","ts":"2021-07-29T09:57:51.490+0200","caller":"embed/etcd.go:711","msg":"starting with client TLS","tls-info":"cert = /var/lib/rancher/k3s/server/tls/etcd/server-client.crt, key = /var/lib/rancher/k3s/server/tls/etcd/server-client.key, trusted-ca = /var/lib/rancher/k3s/server/tls/etcd/server-ca.crt, client-cert-auth = true, crl-file = ","cipher-suites":[]}
{"level":"info","ts":"2021-07-29T09:57:51.491+0200","caller":"embed/etcd.go:244","msg":"now serving peer/client/metrics","local-member-id":"a5ff6d2872667bd3","initial-advertise-peer-urls":["https://162.55.57.75:2380"],"listen-peer-urls":["https://162.55.57.75:2380"],"advertise-client-urls":["https://162.55.57.75:2379"],"listen-client-urls":["https://127.0.0.1:2379","https://162.55.57.75:2379"],"listen-metrics-urls":["http://127.0.0.1:2381"]}
INFO[2021-07-29T09:57:51.491469639+02:00] Running kube-apiserver --advertise-port=6443 --allow-privileged=true --anonymous-auth=false --api-audiences=https://kubernetes.default.svc.cluster.local,k3s --authorization-mode=Node,RBAC --bind-address=127.0.0.1 --cert-dir=/var/lib/rancher/k3s/server/tls/temporary-certs --client-ca-file=/var/lib/rancher/k3s/server/tls/client-ca.crt --enable-admission-plugins=NodeRestriction --etcd-cafile=/var/lib/rancher/k3s/server/tls/etcd/server-ca.crt --etcd-certfile=/var/lib/rancher/k3s/server/tls/etcd/client.crt --etcd-keyfile=/var/lib/rancher/k3s/server/tls/etcd/client.key --etcd-servers=https://127.0.0.1:2379 --insecure-port=0 --kubelet-certificate-authority=/var/lib/rancher/k3s/server/tls/server-ca.crt --kubelet-client-certificate=/var/lib/rancher/k3s/server/tls/client-kube-apiserver.crt --kubelet-client-key=/var/lib/rancher/k3s/server/tls/client-kube-apiserver.key --profiling=false --proxy-client-cert-file=/var/lib/rancher/k3s/server/tls/client-auth-proxy.crt --proxy-client-key-file=/var/lib/rancher/k3s/server/tls/client-auth-proxy.key --requestheader-allowed-names=system:auth-proxy --requestheader-client-ca-file=/var/lib/rancher/k3s/server/tls/request-header-ca.crt --requestheader-extra-headers-prefix=X-Remote-Extra- --requestheader-group-headers=X-Remote-Group --requestheader-username-headers=X-Remote-User --secure-port=6444 --service-account-issuer=https://kubernetes.default.svc.cluster.local --service-account-key-file=/var/lib/rancher/k3s/server/tls/service.key --service-account-signing-key-file=/var/lib/rancher/k3s/server/tls/service.key --service-cluster-ip-range=10.43.0.0/16 --service-node-port-range=30000-32767 --storage-backend=etcd3 --tls-cert-file=/var/lib/rancher/k3s/server/tls/serving-kube-apiserver.crt --tls-private-key-file=/var/lib/rancher/k3s/server/tls/serving-kube-apiserver.key 
{"level":"info","ts":"2021-07-29T09:57:51.491+0200","caller":"embed/etcd.go:579","msg":"serving peer traffic","address":"162.55.57.75:2380"}
{"level":"info","ts":"2021-07-29T09:57:51.491+0200","caller":"embed/etcd.go:781","msg":"serving metrics","address":"http://127.0.0.1:2381"}
{"level":"info","ts":"2021-07-29T09:57:51.704+0200","caller":"raft/raft.go:1530","msg":"a5ff6d2872667bd3 switched to configuration voters=(6610737761517497322 11961399155804765139)"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"membership/cluster.go:422","msg":"removed member","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"fe65734b8888915f","removed-remote-peer-urls":["https://172.16.0.11:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/peer.go:333","msg":"stopping remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"warn","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"fe65734b8888915f"}
{"level":"warn","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/pipeline.go:86","msg":"stopped HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/peer.go:340","msg":"stopped remote peer","remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.705+0200","caller":"rafthttp/transport.go:369","msg":"removed remote peer","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"fe65734b8888915f"}
{"level":"info","ts":"2021-07-29T09:57:51.707+0200","caller":"raft/raft.go:1530","msg":"a5ff6d2872667bd3 switched to configuration voters=(11961399155804765139)"}
{"level":"info","ts":"2021-07-29T09:57:51.707+0200","caller":"membership/cluster.go:422","msg":"removed member","cluster-id":"24c8f0b9fbd23be7","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"5bbe0ef80a82ebea","removed-remote-peer-urls":["https://172.16.0.10:2380"]}
{"level":"info","ts":"2021-07-29T09:57:51.707+0200","caller":"rafthttp/peer.go:333","msg":"stopping remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"warn","ts":"2021-07-29T09:57:51.707+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"warn","ts":"2021-07-29T09:57:51.707+0200","caller":"rafthttp/stream.go:301","msg":"stopped TCP streaming connection with remote peer","stream-writer-type":"unknown stream","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.707+0200","caller":"rafthttp/pipeline.go:86","msg":"stopped HTTP pipelining with remote peer","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.707+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream MsgApp v2","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.708+0200","caller":"rafthttp/stream.go:459","msg":"stopped stream reader with remote peer","stream-reader-type":"stream Message","local-member-id":"a5ff6d2872667bd3","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.708+0200","caller":"rafthttp/peer.go:340","msg":"stopped remote peer","remote-peer-id":"5bbe0ef80a82ebea"}
{"level":"info","ts":"2021-07-29T09:57:51.708+0200","caller":"rafthttp/transport.go:369","msg":"removed remote peer","local-member-id":"a5ff6d2872667bd3","removed-remote-peer-id":"5bbe0ef80a82ebea"}
INFO[2021-07-29T09:57:52.790953065+02:00] Cluster-Http-Server 2021/07/29 09:57:52 http: TLS handshake error from 172.16.0.51:57172: remote error: tls: bad certificate 
INFO[2021-07-29T09:57:54.799473172+02:00] Cluster-Http-Server 2021/07/29 09:57:54 http: TLS handshake error from 172.16.0.51:57180: remote error: tls: bad certificate 
INFO[2021-07-29T09:57:56.808024226+02:00] Cluster-Http-Server 2021/07/29 09:57:56 http: TLS handshake error from 172.16.0.51:57188: remote error: tls: bad certificate 
{"level":"info","ts":"2021-07-29T09:57:58.451+0200","caller":"raft/raft.go:923","msg":"a5ff6d2872667bd3 is starting a new election at term 4491"}
{"level":"info","ts":"2021-07-29T09:57:58.451+0200","caller":"raft/raft.go:713","msg":"a5ff6d2872667bd3 became candidate at term 4492"}
{"level":"info","ts":"2021-07-29T09:57:58.451+0200","caller":"raft/raft.go:824","msg":"a5ff6d2872667bd3 received MsgVoteResp from a5ff6d2872667bd3 at term 4492"}
{"level":"info","ts":"2021-07-29T09:57:58.451+0200","caller":"raft/raft.go:765","msg":"a5ff6d2872667bd3 became leader at term 4492"}
{"level":"info","ts":"2021-07-29T09:57:58.451+0200","caller":"raft/node.go:325","msg":"raft.node: a5ff6d2872667bd3 elected leader a5ff6d2872667bd3 at term 4492"}
{"level":"info","ts":"2021-07-29T09:57:58.453+0200","caller":"etcdserver/server.go:2039","msg":"published local member to cluster through raft","local-member-id":"a5ff6d2872667bd3","local-member-attributes":"{Name:k3s-management-1-5327d6a9 ClientURLs:[https://162.55.57.75:2379]}","request-path":"/0/members/a5ff6d2872667bd3/attributes","cluster-id":"24c8f0b9fbd23be7","publish-timeout":"15s"}
{"level":"info","ts":"2021-07-29T09:57:58.454+0200","caller":"embed/serve.go:191","msg":"serving client traffic securely","address":"162.55.57.75:2379"}
{"level":"info","ts":"2021-07-29T09:57:58.455+0200","caller":"embed/serve.go:191","msg":"serving client traffic securely","address":"127.0.0.1:2379"}
INFO[2021-07-29T09:57:58.466954145+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:57:58.816468970+02:00] Cluster-Http-Server 2021/07/29 09:57:58 http: TLS handshake error from 172.16.0.51:57196: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:00.825246305+02:00] Cluster-Http-Server 2021/07/29 09:58:00 http: TLS handshake error from 172.16.0.51:57204: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:02.838660348+02:00] Cluster-Http-Server 2021/07/29 09:58:02 http: TLS handshake error from 172.16.0.51:57212: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:03.472818782+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:04.848618228+02:00] Cluster-Http-Server 2021/07/29 09:58:04 http: TLS handshake error from 172.16.0.51:57220: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:06.856783897+02:00] Cluster-Http-Server 2021/07/29 09:58:06 http: TLS handshake error from 172.16.0.51:57228: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:08.476937882+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:08.864304219+02:00] Cluster-Http-Server 2021/07/29 09:58:08 http: TLS handshake error from 172.16.0.51:57236: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:10.875420854+02:00] Cluster-Http-Server 2021/07/29 09:58:10 http: TLS handshake error from 172.16.0.51:57244: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:12.887213933+02:00] Cluster-Http-Server 2021/07/29 09:58:12 http: TLS handshake error from 172.16.0.51:57252: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:13.486527988+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:14.895080265+02:00] Cluster-Http-Server 2021/07/29 09:58:14 http: TLS handshake error from 172.16.0.51:57260: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:16.919541959+02:00] Cluster-Http-Server 2021/07/29 09:58:16 http: TLS handshake error from 172.16.0.51:57268: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:18.500788562+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:18.930693370+02:00] Cluster-Http-Server 2021/07/29 09:58:18 http: TLS handshake error from 172.16.0.51:57276: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:20.940753738+02:00] Cluster-Http-Server 2021/07/29 09:58:20 http: TLS handshake error from 172.16.0.51:57284: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:22.950883424+02:00] Cluster-Http-Server 2021/07/29 09:58:22 http: TLS handshake error from 172.16.0.51:57292: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:23.507634328+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:24.960091467+02:00] Cluster-Http-Server 2021/07/29 09:58:24 http: TLS handshake error from 172.16.0.51:57300: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:26.966987645+02:00] Cluster-Http-Server 2021/07/29 09:58:26 http: TLS handshake error from 172.16.0.51:57308: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:28.514233810+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:28.976415500+02:00] Cluster-Http-Server 2021/07/29 09:58:28 http: TLS handshake error from 172.16.0.51:57316: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:30.986735002+02:00] Cluster-Http-Server 2021/07/29 09:58:30 http: TLS handshake error from 172.16.0.51:57324: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:32.995090377+02:00] Cluster-Http-Server 2021/07/29 09:58:32 http: TLS handshake error from 172.16.0.51:57332: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:33.519466982+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:35.003351021+02:00] Cluster-Http-Server 2021/07/29 09:58:35 http: TLS handshake error from 172.16.0.51:57340: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:37.014541163+02:00] Cluster-Http-Server 2021/07/29 09:58:37 http: TLS handshake error from 172.16.0.51:57348: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:38.524965126+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:39.022173900+02:00] Cluster-Http-Server 2021/07/29 09:58:39 http: TLS handshake error from 172.16.0.51:57356: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:41.029111942+02:00] Cluster-Http-Server 2021/07/29 09:58:41 http: TLS handshake error from 172.16.0.51:57364: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:43.037134841+02:00] Cluster-Http-Server 2021/07/29 09:58:43 http: TLS handshake error from 172.16.0.51:57372: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:43.530855265+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:45.045274417+02:00] Cluster-Http-Server 2021/07/29 09:58:45 http: TLS handshake error from 172.16.0.51:57380: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:47.056717943+02:00] Cluster-Http-Server 2021/07/29 09:58:47 http: TLS handshake error from 172.16.0.51:57388: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:48.536413251+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:49.067259127+02:00] Cluster-Http-Server 2021/07/29 09:58:49 http: TLS handshake error from 172.16.0.51:57396: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:51.074598634+02:00] Cluster-Http-Server 2021/07/29 09:58:51 http: TLS handshake error from 172.16.0.51:57404: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:53.084479224+02:00] Cluster-Http-Server 2021/07/29 09:58:53 http: TLS handshake error from 172.16.0.51:57412: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:53.543657179+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:55.094524167+02:00] Cluster-Http-Server 2021/07/29 09:58:55 http: TLS handshake error from 172.16.0.51:57420: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:57.103472864+02:00] Cluster-Http-Server 2021/07/29 09:58:57 http: TLS handshake error from 172.16.0.51:57428: remote error: tls: bad certificate 
INFO[2021-07-29T09:58:58.552410171+02:00] Failed to test data store connection: this server is a not a member of the etcd cluster. Found [k3s-management-1-5327d6a9=https://172.16.0.5:2380], expect: k3s-management-1-5327d6a9=162.55.57.75 
INFO[2021-07-29T09:58:59.113476500+02:00] Cluster-Http-Server 2021/07/29 09:58:59 http: TLS handshake error from 172.16.0.51:57436: remote error: tls: bad certificate 
pandarun commented 3 years ago

@brandond thank you so much! It started working again.