dyrnq / kubernetes-ansible

Setup a high availability kubernetes cluster with Ansible

dial tcp 10.10.10.10:8443: connect: connection refused #5

Open dyrnq opened 2 years ago

dyrnq commented 2 years ago
```
E0921 10:40:47.450989  233225 controller.go:187] failed to update lease, error: Put "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": write tcp 10.10.10.10:55186->10.10.10.10:8443: write: connection reset by peer
E0921 10:40:47.451786  233225 controller.go:187] failed to update lease, error: Put "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:47.452166  233225 controller.go:187] failed to update lease, error: Put "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:47.453661  233225 controller.go:187] failed to update lease, error: Put "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:47.454098  233225 controller.go:187] failed to update lease, error: Put "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:47.454450  233225 controller.go:144] failed to ensure lease exists, will retry in 200ms, error: Get "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:47.655910  233225 controller.go:144] failed to ensure lease exists, will retry in 400ms, error: Get "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:48.058659  233225 controller.go:144] failed to ensure lease exists, will retry in 800ms, error: Get "https://apiserver.k8s.local:8443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/worker-2?timeout=10s": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:48.444100  233225 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://apiserver.k8s.local:8443/api/v1/nodes?fieldSelector=metadata.name%3Dworker-2&resourceVersion=238949": dial tcp 10.10.10.10:8443: connect: connection refused
E0921 10:40:48.507289  233225 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://apiserver.k8s.local:8443/apis/node.k8s.io/v1/runtimeclasses?resourceVersion=238668": dial tcp 10.10.10.10:8443: connect: connection refused
```
dyrnq commented 2 years ago

kube-proxy needs the argument `--ipvs-exclude-cidrs=10.10.10.10/32`. In IPVS mode, kube-proxy periodically cleans up IPVS rules it does not recognize as its own; without this exclusion it removes the rules for the HA apiserver VIP `10.10.10.10`, which produces the intermittent `connection reset` / `connection refused` errors above.
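If kube-proxy is configured through a `KubeProxyConfiguration` object (e.g. the `kube-proxy` ConfigMap) rather than command-line flags, the equivalent setting is `ipvs.excludeCIDRs`. A minimal sketch, assuming the VIP `10.10.10.10/32` from the logs above; other fields in your existing configuration should be left as they are:

```yaml
apiVersion: kubeproxy.config.k8s.io/v1alpha1
kind: KubeProxyConfiguration
mode: "ipvs"
ipvs:
  # CIDRs the IPVS proxier must not touch when cleaning up IPVS rules.
  # Excluding the load-balancer VIP stops kube-proxy from deleting it.
  excludeCIDRs:
    - "10.10.10.10/32"
```

After changing the ConfigMap, the kube-proxy pods must be restarted for the new exclusion to take effect.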