kubernetes-sigs / kubespray

Deploy a Production Ready Kubernetes Cluster
Apache License 2.0

"Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]" #10946

Closed: luozc5 closed this issue 3 months ago

luozc5 commented 8 months ago

What happened?

I used kubespray (tag 2.24.0) to deploy a single-node k8s cluster, and everything worked fine. But when I restarted the machine, the entire k8s cluster stopped working.

root@node1:/home/ubuntu# kubectl get pods -A -o wide
NAMESPACE            NAME                                      READY   STATUS             RESTARTS         AGE   IP               NODE    NOMINATED NODE   READINESS GATES
ingress-nginx        ingress-nginx-controller-9wqqw            0/1     CrashLoopBackOff   14 (3m47s ago)   16h   10.233.102.144   node1
kube-system          calico-kube-controllers-648dffd99-8qk84   0/1     CrashLoopBackOff   13 (89s ago)     16h   10.233.102.142   node1
kube-system          calico-node-d7s5k                         0/1     Completed          0                16h   172.18.6.107     node1
kube-system          coredns-77f7cc69db-bq7gq                  0/1     Running            2 (84m ago)      16h   10.233.102.141   node1
kube-system          dns-autoscaler-595558c478-rhnw5           1/1     Running            2 (84m ago)      16h   10.233.102.146   node1
kube-system          kube-apiserver-node1                      0/1     Running            3 (84m ago)      16h   172.18.6.107     node1
kube-system          kube-controller-manager-node1             0/1     Running            4 (84m ago)      16h   172.18.6.107     node1
kube-system          kube-proxy-w5527                          1/1     Running            2 (84m ago)      16h   172.18.6.107     node1
kube-system          kube-scheduler-node1                      0/1     Running            4 (84m ago)      16h   172.18.6.107     node1
kube-system          metrics-server-bd6df7764-tl6cm            0/1     CrashLoopBackOff   13 (4m12s ago)   16h   10.233.102.143   node1
local-path-storage   local-path-provisioner-f78b6cbbc-gfwk8    0/1     CrashLoopBackOff   14 (4m46s ago)   16h   10.233.102.145   node1

When I looked at the apiserver's logs, I found it was repeatedly printing the following errors:

E0223 03:00:36.836614       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:37.036443       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:37.241868       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:37.435932       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:37.636295       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:37.835913       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:38.037226       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:38.235701       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:38.436108       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"
E0223 03:00:38.637042       1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"

What is the reason? Thanks.
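[Editorial note: the "service account token is not valid yet" message is what JWT time-claim validation produces when a token's nbf/iat timestamps lie in the future relative to the apiserver's clock, which typically points at the system time having jumped after the reboot. This is not Kubernetes' actual implementation, just a minimal standard-library sketch of how such a check behaves under a skewed clock; the token payload and timestamps below are hypothetical:]

```python
import base64
import json

def decode_payload(jwt: str) -> dict:
    """Decode the (unverified) payload segment of a JWT."""
    payload_b64 = jwt.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def token_valid_yet(claims: dict, now: float) -> bool:
    """Mimic the 'not valid yet' check: nbf (or iat) must not lie in the future."""
    nbf = claims.get("nbf", claims.get("iat", 0))
    return now >= nbf

# Hypothetical token issued while the clock read 10:03 UTC.
issued_at = 1708941780  # 2024-02-26 10:03:00 UTC (illustrative)
claims = {"iat": issued_at, "nbf": issued_at,
          "sub": "system:serviceaccount:kube-system:coredns"}

print(token_valid_yet(claims, issued_at + 60))        # clock correct: token accepted
print(token_valid_yet(claims, issued_at - 7 * 3600))  # clock ~7h behind: "not valid yet"
```

If the apiserver's clock later steps backwards (e.g. the hardware clock is behind until NTP resyncs), every token issued before the step fails this check, matching the flood of errors above.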

What did you expect to happen?

After the machine restarts, the k8s cluster runs normally

How can we reproduce it (as minimally and precisely as possible)?

Restart the machine

OS

Linux 5.15.0-94-generic x86_64
PRETTY_NAME="Ubuntu 22.04.3 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04.3 LTS (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy

Version of Ansible

ansible [core 2.15.9]
  config file = /home/ubuntu/kubespray-2.24.0/ansible.cfg
  configured module search path = ['/home/ubuntu/kubespray-2.24.0/library']
  ansible python module location = /home/ubuntu/kubespray-venv/lib/python3.10/site-packages/ansible
  ansible collection location = /root/.ansible/collections:/usr/share/ansible/collections
  executable location = /home/ubuntu/kubespray-venv/bin/ansible
  python version = 3.10.12 (main, Nov 20 2023, 15:14:05) [GCC 11.4.0] (/home/ubuntu/kubespray-venv/bin/python3)
  jinja version = 3.1.2
  libyaml = True

Version of Python

Python 3.10.12

Version of Kubespray (commit)

tag 2.24.0

Network plugin used

calico

Full inventory with variables

(kubespray-venv) root@node1:/home/ubuntu/kubespray-2.24.0# ansible -i inventory/mycluster/hosts.yaml all -m debug -a "var=hostvars[node1]"
[WARNING]: Skipping callback plugin 'ara_default', unable to load
node1 | SUCCESS => {
    "hostvars[node1]": "VARIABLE IS NOT DEFINED!"
}

Command used to invoke ansible

ansible-playbook -i inventory/mycluster/hosts.yaml --become --become-user=root cluster.yml -vvv

Output of ansible run

TASK [network_plugin/calico : Check if inventory match current cluster configuration] ****
task path: /home/ubuntu/kubespray-2.24.0/roles/network_plugin/calico/tasks/check.yml:168
ok: [node1] => {
    "changed": false,
    "msg": "All assertions passed"
}
Thursday 22 February 2024 10:52:24 +0000 (0:00:00.115)   0:45:58.431 *****
Thursday 22 February 2024 10:52:24 +0000 (0:00:00.086)   0:45:58.517 *****
Thursday 22 February 2024 10:52:24 +0000 (0:00:00.076)   0:45:58.594 *****

PLAY RECAP *****
localhost : ok=3   changed=0   unreachable=0 failed=0 skipped=0    rescued=0 ignored=0
node1     : ok=776 changed=150 unreachable=0 failed=0 skipped=1168 rescued=0 ignored=6

Thursday 22 February 2024 10:52:24 +0000 (0:00:00.162) 0:45:58.757 *****

container-engine/crictl : Download_file | Download item -------------------------- 1653.88s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_file | Download item ------------------------------------------- 58.88s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_file | Download item ------------------------------------------- 44.42s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
container-engine/docker : Ensure docker packages are installed --------------------- 36.50s
/home/ubuntu/kubespray-2.24.0/roles/container-engine/docker/tasks/main.yml:105
download : Download_file | Download item ------------------------------------------- 35.33s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_file | Download item ------------------------------------------- 34.87s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
network_plugin/calico : Check if calico ready -------------------------------------- 32.24s
/home/ubuntu/kubespray-2.24.0/roles/network_plugin/calico/tasks/check.yml:53
download : Download_file | Download item ------------------------------------------- 29.65s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_container | Download image if required ------------------------- 28.33s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
download : Download_container | Download image if required ------------------------- 27.08s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
download : Download_file | Download item ------------------------------------------- 25.58s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_container | Download image if required ------------------------- 21.33s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
download : Download_container | Download image if required ------------------------- 20.14s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
download : Download_container | Download image if required ------------------------- 19.47s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
container-engine/cri-dockerd : Download_file | Download item ----------------------- 19.37s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_file.yml:88
download : Download_container | Download image if required ------------------------- 18.91s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
kubernetes/control-plane : Kubeadm | Initialize first master ----------------------- 17.17s
/home/ubuntu/kubespray-2.24.0/roles/kubernetes/control-plane/tasks/kubeadm-setup.yml:178
download : Download_container | Download image if required ------------------------- 16.57s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
download : Download_container | Download image if required ------------------------- 14.90s
/home/ubuntu/kubespray-2.24.0/roles/download/tasks/download_container.yml:57
etcdctl_etcdutl : Copy etcdctl and etcdutl binary from docker container ------------ 13.74s
/home/ubuntu/kubespray-2.24.0/roles/etcdctl_etcdutl/tasks/main.yml:2
(kubespray-venv) root@ubuntu:/home/ubuntu/kubespray-2.24.0#

Anything else we need to know

No response

zhangguanzhang commented 8 months ago
grep Time /var/log/syslog
luozc5 commented 8 months ago

@zhangguanzhang Thank you for your reply. Over the weekend the cluster came back to normal on its own, but when I restarted the machine again the same problem returned. After it reoccurred I also tried manually restarting the kube-apiserver pod, but the problem persisted.

root@node1:/home/ubuntu# grep Time /var/log/syslog
Feb 26 02:54:57 node1 systemd[3932970]: Reached target Timers.
Feb 26 02:55:43 node1 systemd[1]: Stopped target Timer Units.
Feb 26 10:03:35 node1 kernel: [    0.091679] ACPI: PM-Timer IO Port: 0x1008
Feb 26 10:03:35 node1 systemd[1]: Starting Network Time Synchronization...
Feb 26 10:03:35 node1 systemd[1]: Started Network Time Synchronization.
Feb 26 10:03:35 node1 systemd[1]: Reached target System Time Set.
Feb 26 10:03:35 node1 systemd[1]: Condition check resulted in Timer to automatically fetch and run repair assertions being skipped.
Feb 26 10:03:35 node1 systemd[1]: Condition check resulted in Ubuntu Pro Timer for running repeated jobs being skipped.
Feb 26 10:03:35 node1 systemd[1]: Reached target Timer Units.
Feb 26 10:03:37 node1 systemd[885]: Reached target Timers.
Feb 26 10:03:41 node1 cri-dockerd[1106]: time="2024-02-26T10:03:41Z" level=info msg="Docker Info: &{ID:BENZ:JLAQ:ZS2L:YKAH:IOE6:C7IG:APYH:SX42:EEQF:QQLK:XCYH:7MKM Containers:43 ContainersRunning:0 ContainersPaused:0 ContainersStopped:43 Images:16 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true] [userxattr false]] SystemStatus:[] Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization:[] Log:[awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:false KernelMemoryTCP:false CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6tables:true Debug:false NFd:25 OomKillDisable:false NGoroutines:34 SystemTime:2024-02-26T10:03:41.429680742Z LoggingDriver:json-file CgroupDriver:systemd CgroupVersion:2 NEventsListener:0 KernelVersion:5.15.0-97-generic OperatingSystem:Ubuntu 22.04.3 LTS OSVersion:22.04 OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:0xc0004cf260 NCPU:1 MemTotal:33653272576 GenericResources:[] DockerRootDir:/var/lib/docker HTTPProxy: HTTPSProxy: NoProxy: Name:node1 Labels:[] ExperimentalBuild:false ServerVersion:20.10.20 ClusterStore: ClusterAdvertise: Runtimes:map[io.containerd.runc.v2:{Path:runc Args:[] Shim:} io.containerd.runtime.v1.linux:{Path:runc Args:[] Shim:} runc:{Path:runc Args:[] Shim:}] DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:[] Nodes:0 Managers:0 Cluster: Warnings:[]} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:31aa4358a36870b21a992d3ad2bef29e1d693bec Expected:31aa4358a36870b21a992d3ad2bef29e1d693bec} RuncCommit:{ID:v1.1.4-0-g5fd4c4d Expected:v1.1.4-0-g5fd4c4d} InitCommit:{ID:de40ad0 Expected:de40ad0} SecurityOptions:[name=apparmor name=seccomp,profile=default name=cgroupns] ProductLicense: DefaultAddressPools:[] Warnings:[]}"
Feb 26 10:03:47 node1 systemd[1]: Starting Time & Date Service...
Feb 26 10:03:48 node1 systemd[1]: Started Time & Date Service.
Feb 26 10:03:56 node1 kubelet[1355]: I0226 10:03:56.550028 1355 manager.go:210] Machine: {Timestamp:2024-02-26 10:03:56.549473698 +0000 UTC m=+0.133291932 CPUVendorID:GenuineIntel NumCores:1 NumPhysicalCores:1 NumSockets:1 CpuFrequency:2294609 MemoryCapacity:33653272576 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:c179c8117ef740a39003b79a15d39ca7 SystemUUID:8abf4d56-ed5c-28c2-c9ab-85080adc4478 BootID:d466c79c-8590-4d78-9f04-74ad36c7ae1f Filesystems:[{Device:/dev/mapper/ubuntu--vg-ubuntu--lv DeviceMajor:253 DeviceMinor:0 Capacity:66257006592 Type:vfs Inodes:4128768 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:27 Capacity:16826634240 Type:vfs Inodes:4108065 HasInodes:true} {Device:/run/lock DeviceMajor:0 DeviceMinor:28 Capacity:5242880 Type:vfs Inodes:4108065 HasInodes:true} {Device:/dev/sda2 DeviceMajor:8 DeviceMinor:2 Capacity:2040373248 Type:vfs Inodes:131072 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3365326848 Type:vfs Inodes:821613 HasInodes:true} {Device:/run/snapd/ns DeviceMajor:0 DeviceMinor:25 Capacity:3365330944 Type:vfs Inodes:4108065 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:25 Capacity:3365330944 Type:vfs Inodes:4108065 HasInodes:true}] DiskMap:map[253:0:{Name:dm-0 Major:253 Minor:0 Size:67641540608 Scheduler:none} 8:0:{Name:sda Major:8 Minor:0 Size:137438953472 Scheduler:mq-deadline}] NetworkDevices:[{Name:ens160 MacAddress:00:0c:29:dc:44:78 Speed:10000 Mtu:1500}] Topology:[{Id:0 Memory:33653272576 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0}] Caches:[{Id:0 Size:11534336 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown 
InstanceID:None} Feb 26 10:03:56 node1 kubelet[1355]: E0226 10:03:56.648834 1355 event.go:289] Unable to write event: '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"node1.17b7618bc6b5ab4f", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, InvolvedObject:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"node1", UID:"node1", APIVersion:"", ResourceVersion:"", FieldPath:""}, Reason:"Starting", Message:"Starting kubelet.", Source:v1.EventSource{Component:"kubelet", Host:"node1"}, FirstTimestamp:time.Date(2024, time.February, 26, 10, 3, 56, 572535631, time.Local), LastTimestamp:time.Date(2024, time.February, 26, 10, 3, 56, 572535631, time.Local), Count:1, Type:"Normal", EventTime:time.Date(1, time.January, 1, 0, 0, 0, 0, time.UTC), Series:(v1.EventSeries)(nil), Action:"", Related:(v1.ObjectReference)(nil), ReportingController:"kubelet", ReportingInstance:"node1"}': 'Post "https://127.0.0.1:6443/api/v1/namespaces/default/events": dial tcp 127.0.0.1:6443: connect: connection refused'(may retry after sleeping) Feb 26 02:56:46 node1 cri-dockerd[1106]: 2024-02-26 02:56:46.705 [WARNING][2256] k8s.go 540: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. 
ContainerID="30cf477c850253ac882ad9a6b5eaf469bbfaa8253f460ff7ff79174a7a6b874c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-coredns--77f7cc69db--bq7gq-eth0", GenerateName:"coredns-77f7cc69db-", Namespace:"kube-system", SelfLink:"", UID:"e028061b-faa2-473d-8cf1-059481abb7f5", ResourceVersion:"438655", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"77f7cc69db", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"5ceec8fbff30a2567dfd19f91e8b00882e76ec7776c79253abe0bcf8b97bb3f6", Pod:"coredns-77f7cc69db-bq7gq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11226f087b9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.097 [INFO][2422] k8s.go 383: Populated endpoint 
ContainerID="ddaf0f769afb5045ce1f7db36ae5efe103c3229d44217e746961356b807d8d73" Namespace="kube-system" Pod="metrics-server-bd6df7764-tl6cm" WorkloadEndpoint="node1-k8s-metrics--server--bd6df7764--tl6cm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-metrics--server--bd6df7764--tl6cm-eth0", GenerateName:"metrics-server-bd6df7764-", Namespace:"kube-system", SelfLink:"", UID:"8971fa7f-c0fb-4c49-8979-f55f55eb3205", ResourceVersion:"438653", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 51, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"metrics-server", "pod-template-hash":"bd6df7764", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"metrics-server", "version":"v0.6.4"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"7164acba58908369d39a829ecd44beb914597b4a14d84a046f3e92a463025d14", Pod:"metrics-server-bd6df7764-tl6cm", Endpoint:"eth0", ServiceAccountName:"metrics-server", IPNetworks:[]string{"10.233.102.147/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.metrics-server"}, InterfaceName:"cali2a2a364c19e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"https", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x280a, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.140 [INFO][2422] k8s.go 411: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ddaf0f769afb5045ce1f7db36ae5efe103c3229d44217e746961356b807d8d73" Namespace="kube-system" Pod="metrics-server-bd6df7764-tl6cm" WorkloadEndpoint="node1-k8s-metrics--server--bd6df7764--tl6cm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-metrics--server--bd6df7764--tl6cm-eth0", GenerateName:"metrics-server-bd6df7764-", Namespace:"kube-system", SelfLink:"", UID:"8971fa7f-c0fb-4c49-8979-f55f55eb3205", ResourceVersion:"438653", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 51, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"metrics-server", "pod-template-hash":"bd6df7764", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"metrics-server", "version":"v0.6.4"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"ddaf0f769afb5045ce1f7db36ae5efe103c3229d44217e746961356b807d8d73", Pod:"metrics-server-bd6df7764-tl6cm", Endpoint:"eth0", ServiceAccountName:"metrics-server", IPNetworks:[]string{"10.233.102.147/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.metrics-server"}, InterfaceName:"cali2a2a364c19e", MAC:"2e:14:74:d8:bb:11", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"https", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x280a, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.470 [INFO][2457] k8s.go 383: Populated endpoint ContainerID="a8f39f556229e8fb2348ceb3f786337e4356e8078f9d1241a7943987d2388414" 
Namespace="kube-system" Pod="calico-kube-controllers-648dffd99-8qk84" WorkloadEndpoint="node1-k8s-calico--kube--controllers--648dffd99--8qk84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-calico--kube--controllers--648dffd99--8qk84-eth0", GenerateName:"calico-kube-controllers-648dffd99-", Namespace:"kube-system", SelfLink:"", UID:"b8d17f7a-f04a-4d5f-9224-11ab3a6dcbe4", ResourceVersion:"438654", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"calico-kube-controllers", "pod-template-hash":"648dffd99", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"9d353e265fe87078d14fd82b0ba937b22d657ef6efcf92d7ec7613ebd3f95b82", Pod:"calico-kube-controllers-648dffd99-8qk84", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"10.233.102.148/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.calico-kube-controllers"}, InterfaceName:"cali1ee7dd51e2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.506 [INFO][2457] k8s.go 411: Added Mac, interface name, and active container ID to endpoint ContainerID="a8f39f556229e8fb2348ceb3f786337e4356e8078f9d1241a7943987d2388414" Namespace="kube-system" Pod="calico-kube-controllers-648dffd99-8qk84" 
WorkloadEndpoint="node1-k8s-calico--kube--controllers--648dffd99--8qk84-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-calico--kube--controllers--648dffd99--8qk84-eth0", GenerateName:"calico-kube-controllers-648dffd99-", Namespace:"kube-system", SelfLink:"", UID:"b8d17f7a-f04a-4d5f-9224-11ab3a6dcbe4", ResourceVersion:"438654", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"calico-kube-controllers", "pod-template-hash":"648dffd99", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"a8f39f556229e8fb2348ceb3f786337e4356e8078f9d1241a7943987d2388414", Pod:"calico-kube-controllers-648dffd99-8qk84", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"10.233.102.148/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.calico-kube-controllers"}, InterfaceName:"cali1ee7dd51e2a", MAC:"ea:64:d3:ff:07:21", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.819 [INFO][2542] k8s.go 383: Populated endpoint ContainerID="f42e825aac90fc27d4b33970ff9856e16e059cdc70e327afec1ccf7cf28b7023" Namespace="kube-system" Pod="coredns-77f7cc69db-bq7gq" WorkloadEndpoint="node1-k8s-coredns--77f7cc69db--bq7gq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-coredns--77f7cc69db--bq7gq-eth0", GenerateName:"coredns-77f7cc69db-", Namespace:"kube-system", SelfLink:"", UID:"e028061b-faa2-473d-8cf1-059481abb7f5", ResourceVersion:"438655", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"77f7cc69db", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"5ceec8fbff30a2567dfd19f91e8b00882e76ec7776c79253abe0bcf8b97bb3f6", Pod:"coredns-77f7cc69db-bq7gq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"10.233.102.149/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11226f087b9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:48 node1 cri-dockerd[1106]: 2024-02-26 02:56:48.850 [INFO][2542] k8s.go 411: Added Mac, interface name, and active container ID to endpoint ContainerID="f42e825aac90fc27d4b33970ff9856e16e059cdc70e327afec1ccf7cf28b7023" Namespace="kube-system" Pod="coredns-77f7cc69db-bq7gq" 
WorkloadEndpoint="node1-k8s-coredns--77f7cc69db--bq7gq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-coredns--77f7cc69db--bq7gq-eth0", GenerateName:"coredns-77f7cc69db-", Namespace:"kube-system", SelfLink:"", UID:"e028061b-faa2-473d-8cf1-059481abb7f5", ResourceVersion:"438655", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"77f7cc69db", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"f42e825aac90fc27d4b33970ff9856e16e059cdc70e327afec1ccf7cf28b7023", Pod:"coredns-77f7cc69db-bq7gq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"10.233.102.149/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11226f087b9", MAC:"4e:a7:f4:24:29:f0", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:49 node1 cri-dockerd[1106]: 2024-02-26 02:56:49.502 [INFO][2700] k8s.go 383: Populated endpoint 
ContainerID="0c34d2c27499407061dbc1cb16f2f5ffab4eb97632eb4abc3375f6ecf0e46b8e" Namespace="local-path-storage" Pod="local-path-provisioner-f78b6cbbc-gfwk8" WorkloadEndpoint="node1-k8s-local--path--provisioner--f78b6cbbc--gfwk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-local--path--provisioner--f78b6cbbc--gfwk8-eth0", GenerateName:"local-path-provisioner-f78b6cbbc-", Namespace:"local-path-storage", SelfLink:"", UID:"0a17de58-168b-4efa-a515-932eed526705", ResourceVersion:"438656", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app":"local-path-provisioner", "pod-template-hash":"f78b6cbbc", "projectcalico.org/namespace":"local-path-storage", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"local-path-provisioner-service-account"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"39e15073379e65602630b1d83b1bb32c70fde753216bf379c42f1a66119e3bc2", Pod:"local-path-provisioner-f78b6cbbc-gfwk8", Endpoint:"eth0", ServiceAccountName:"local-path-provisioner-service-account", IPNetworks:[]string{"10.233.102.150/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.local-path-storage", "ksa.local-path-storage.local-path-provisioner-service-account"}, InterfaceName:"cali890167efdbb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:49 node1 cri-dockerd[1106]: 2024-02-26 02:56:49.529 [INFO][2700] k8s.go 411: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0c34d2c27499407061dbc1cb16f2f5ffab4eb97632eb4abc3375f6ecf0e46b8e" Namespace="local-path-storage" Pod="local-path-provisioner-f78b6cbbc-gfwk8" WorkloadEndpoint="node1-k8s-local--path--provisioner--f78b6cbbc--gfwk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-local--path--provisioner--f78b6cbbc--gfwk8-eth0", GenerateName:"local-path-provisioner-f78b6cbbc-", Namespace:"local-path-storage", SelfLink:"", UID:"0a17de58-168b-4efa-a515-932eed526705", ResourceVersion:"438656", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app":"local-path-provisioner", "pod-template-hash":"f78b6cbbc", "projectcalico.org/namespace":"local-path-storage", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"local-path-provisioner-service-account"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"0c34d2c27499407061dbc1cb16f2f5ffab4eb97632eb4abc3375f6ecf0e46b8e", Pod:"local-path-provisioner-f78b6cbbc-gfwk8", Endpoint:"eth0", ServiceAccountName:"local-path-provisioner-service-account", IPNetworks:[]string{"10.233.102.150/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.local-path-storage", "ksa.local-path-storage.local-path-provisioner-service-account"}, InterfaceName:"cali890167efdbb", MAC:"f2:38:17:0f:1a:88", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:49 node1 cri-dockerd[1106]: 2024-02-26 02:56:49.714 [INFO][2710] k8s.go 383: Populated endpoint ContainerID="6a600efebb5d75b147dcf341a009ab6e7b84355786969432e75f6c9d75ebdd8b" 
Namespace="ingress-nginx" Pod="ingress-nginx-controller-9wqqw" WorkloadEndpoint="node1-k8s-ingress--nginx--controller--9wqqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-ingress--nginx--controller--9wqqw-eth0", GenerateName:"ingress-nginx-controller-", Namespace:"ingress-nginx", SelfLink:"", UID:"e7f3654a-1599-4679-b193-42bd18b1127c", ResourceVersion:"438657", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"ingress-nginx", "app.kubernetes.io/part-of":"ingress-nginx", "controller-revision-hash":"79598d894d", "pod-template-generation":"1", "projectcalico.org/namespace":"ingress-nginx", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"ingress-nginx"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"de28a87642423640c0784b5c4eeb135ac10a19529eaa7deb9b62e4be57494203", Pod:"ingress-nginx-controller-9wqqw", Endpoint:"eth0", ServiceAccountName:"ingress-nginx", IPNetworks:[]string{"10.233.102.151/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.ingress-nginx", "ksa.ingress-nginx.ingress-nginx"}, InterfaceName:"calie33ba227812", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"http", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x50, HostPort:0x50, HostIP:""}, v3.WorkloadEndpointPort{Name:"https", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1bb, HostPort:0x1bb, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x280e, 
HostPort:0x280e, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:49 node1 cri-dockerd[1106]: 2024-02-26 02:56:49.764 [INFO][2710] k8s.go 411: Added Mac, interface name, and active container ID to endpoint ContainerID="6a600efebb5d75b147dcf341a009ab6e7b84355786969432e75f6c9d75ebdd8b" Namespace="ingress-nginx" Pod="ingress-nginx-controller-9wqqw" WorkloadEndpoint="node1-k8s-ingress--nginx--controller--9wqqw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-ingress--nginx--controller--9wqqw-eth0", GenerateName:"ingress-nginx-controller-", Namespace:"ingress-nginx", SelfLink:"", UID:"e7f3654a-1599-4679-b193-42bd18b1127c", ResourceVersion:"438657", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 49, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"ingress-nginx", "app.kubernetes.io/part-of":"ingress-nginx", "controller-revision-hash":"79598d894d", "pod-template-generation":"1", "projectcalico.org/namespace":"ingress-nginx", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"ingress-nginx"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"6a600efebb5d75b147dcf341a009ab6e7b84355786969432e75f6c9d75ebdd8b", Pod:"ingress-nginx-controller-9wqqw", Endpoint:"eth0", ServiceAccountName:"ingress-nginx", IPNetworks:[]string{"10.233.102.151/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.ingress-nginx", "ksa.ingress-nginx.ingress-nginx"}, InterfaceName:"calie33ba227812", MAC:"4e:a6:5b:3c:b3:a6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"http", Protocol:numorstring.Protocol{Type:1, 
NumVal:0x0, StrVal:"TCP"}, Port:0x50, HostPort:0x50, HostIP:""}, v3.WorkloadEndpointPort{Name:"https", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1bb, HostPort:0x1bb, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x280e, HostPort:0x280e, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:51 node1 cri-dockerd[1106]: 2024-02-26 02:56:51.604 [WARNING][1431] k8s.go 540: CNI_CONTAINERID does not match WorkloadEndpoint ConainerID, don't delete WEP. ContainerID="b0cc65d7161837b67eb1576dd2a4d67891928f7c125d9445dfddc25c0200925c" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-dns--autoscaler--595558c478--rhnw5-eth0", GenerateName:"dns-autoscaler-595558c478-", Namespace:"kube-system", SelfLink:"", UID:"60ee04b2-4376-44b0-8951-8af68132da6d", ResourceVersion:"438618", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"dns-autoscaler", "pod-template-hash":"595558c478", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"dns-autoscaler"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"724544dfa81b991b583d4b1f748f30ef2373477cefdeb751c1125852cc730598", Pod:"dns-autoscaler-595558c478-rhnw5", Endpoint:"eth0", ServiceAccountName:"dns-autoscaler", IPNetworks:[]string{"10.233.102.146/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.dns-autoscaler"}, InterfaceName:"calif3b39a87dd2", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:54 node1 cri-dockerd[1106]: 2024-02-26 02:56:54.781 [INFO][3226] k8s.go 383: Populated endpoint ContainerID="862613ef424cb53083bdf84c907c24e2bc635971f13b3e78480fb9d3ff6d4d3b" Namespace="kube-system" Pod="dns-autoscaler-595558c478-rhnw5" WorkloadEndpoint="node1-k8s-dns--autoscaler--595558c478--rhnw5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-dns--autoscaler--595558c478--rhnw5-eth0", GenerateName:"dns-autoscaler-595558c478-", Namespace:"kube-system", SelfLink:"", UID:"60ee04b2-4376-44b0-8951-8af68132da6d", ResourceVersion:"438712", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(int64)(nil), Labels:map[string]string{"k8s-app":"dns-autoscaler", "pod-template-hash":"595558c478", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"dns-autoscaler"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"724544dfa81b991b583d4b1f748f30ef2373477cefdeb751c1125852cc730598", Pod:"dns-autoscaler-595558c478-rhnw5", Endpoint:"eth0", ServiceAccountName:"dns-autoscaler", IPNetworks:[]string{"10.233.102.152/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.dns-autoscaler"}, InterfaceName:"calif3b39a87dd2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 26 02:56:54 node1 cri-dockerd[1106]: 2024-02-26 02:56:54.816 [INFO][3226] k8s.go 411: Added Mac, interface name, and active container ID to endpoint 
ContainerID="862613ef424cb53083bdf84c907c24e2bc635971f13b3e78480fb9d3ff6d4d3b" Namespace="kube-system" Pod="dns-autoscaler-595558c478-rhnw5" WorkloadEndpoint="node1-k8s-dns--autoscaler--595558c478--rhnw5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"node1-k8s-dns--autoscaler--595558c478--rhnw5-eth0", GenerateName:"dns-autoscaler-595558c478-", Namespace:"kube-system", SelfLink:"", UID:"60ee04b2-4376-44b0-8951-8af68132da6d", ResourceVersion:"438712", Generation:0, CreationTimestamp:time.Date(2024, time.February, 22, 10, 50, 21, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"dns-autoscaler", "pod-template-hash":"595558c478", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"dns-autoscaler"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"node1", ContainerID:"862613ef424cb53083bdf84c907c24e2bc635971f13b3e78480fb9d3ff6d4d3b", Pod:"dns-autoscaler-595558c478-rhnw5", Endpoint:"eth0", ServiceAccountName:"dns-autoscaler", IPNetworks:[]string{"10.233.102.152/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.dns-autoscaler"}, InterfaceName:"calif3b39a87dd2", MAC:"46:6b:fb:63:9d:1e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}

k8s-triage-robot commented 5 months ago

The Kubernetes project currently lacks enough contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Mark this issue as fresh with `/remove-lifecycle stale`
- Close this issue with `/close`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle stale

k8s-triage-robot commented 4 months ago

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues.

This bot triages un-triaged issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Mark this issue as fresh with `/remove-lifecycle rotten`
- Close this issue with `/close`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/lifecycle rotten

k8s-triage-robot commented 3 months ago

The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.

This bot triages issues according to the following rules:

- After 90d of inactivity, `lifecycle/stale` is applied
- After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
- After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed

You can:

- Reopen this issue with `/reopen`
- Mark this issue as fresh with `/remove-lifecycle rotten`
- Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)

Please send feedback to sig-contributor-experience at kubernetes/community.

/close not-planned

k8s-ci-robot commented 3 months ago

@k8s-triage-robot: Closing this issue, marking it as "Not Planned".

In response to [this](https://github.com/kubernetes-sigs/kubespray/issues/10946#issuecomment-2249346605):

> The Kubernetes project currently lacks enough active contributors to adequately respond to all issues and PRs.
>
> This bot triages issues according to the following rules:
>
> - After 90d of inactivity, `lifecycle/stale` is applied
> - After 30d of inactivity since `lifecycle/stale` was applied, `lifecycle/rotten` is applied
> - After 30d of inactivity since `lifecycle/rotten` was applied, the issue is closed
>
> You can:
>
> - Reopen this issue with `/reopen`
> - Mark this issue as fresh with `/remove-lifecycle rotten`
> - Offer to help out with [Issue Triage](https://www.kubernetes.dev/docs/guide/issue-triage/)
>
> Please send feedback to sig-contributor-experience at [kubernetes/community](https://github.com/kubernetes/community).
>
> /close not-planned

Instructions for interacting with me using PR comments are available [here](https://git.k8s.io/community/contributors/guide/pull-requests.md). If you have questions or suggestions related to my behavior, please file an issue against the [kubernetes-sigs/prow](https://github.com/kubernetes-sigs/prow/issues/new?title=Prow%20issue:) repository.
smilena2012 commented 3 months ago

I have the same issue with Kubernetes v1.27.7 (3 master nodes). After a restart of any node, the pods in kube-system take about half an hour to start, and the kube-apiserver logs show errors like:

E0812 06:45:53.245948 1 authentication.go:73] "Unable to authenticate the request" err="[invalid bearer token, service account token is not valid yet]"


I checked the NTP configuration and the time on all 3 nodes is in sync. I also tested the disks; they are working fine.
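For what it's worth, "service account token is not valid yet" generally means the apiserver's clock is behind the token's `iat`/`nbf` claims, which can happen when the hardware clock jumps backwards across a reboot before NTP resyncs. Beyond comparing wall clocks, one can decode a service account token and compare its issuance time against the local clock. A minimal diagnostic sketch (the token here is synthetic for illustration; on a real node you would read one from a pod's `/var/run/secrets/kubernetes.io/serviceaccount/token`):

```python
import base64
import json
import time

def jwt_claims(token: str) -> dict:
    """Decode a JWT payload WITHOUT verifying the signature (diagnostics only)."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def iat_skew_seconds(token: str) -> float:
    """Positive result: the token looks 'issued in the future' from this clock's view."""
    return jwt_claims(token)["iat"] - time.time()

def _b64(obj: dict) -> str:
    return base64.urlsafe_b64encode(json.dumps(obj).encode()).rstrip(b"=").decode()

# Synthetic token whose iat is one hour ahead, mimicking a node whose clock
# is an hour behind the clock that minted the token.
now = int(time.time())
token = ".".join([_b64({"alg": "none"}), _b64({"iat": now + 3600, "exp": now + 7200}), "sig"])
print(f"iat skew: {iat_skew_seconds(token):.0f}s")  # large positive skew suggests a clock problem
```

A near-zero or negative skew for a freshly projected token rules clock drift out; a large positive skew points at the node clock rather than the token itself.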

smilena2012 commented 3 months ago

/reopen

k8s-ci-robot commented 3 months ago

@smilena2012: You can't reopen an issue/PR unless you authored it or you are a collaborator.

In response to [this](https://github.com/kubernetes-sigs/kubespray/issues/10946#issuecomment-2283250180):

> /reopen

Instructions for interacting with me using PR comments are available [here](https://git.k8s.io/community/contributors/guide/pull-requests.md). If you have questions or suggestions related to my behavior, please file an issue against the [kubernetes-sigs/prow](https://github.com/kubernetes-sigs/prow/issues/new?title=Prow%20issue:) repository.