Open: Dominik-Robert opened this issue 5 years ago
@Dominik-Robert weave-npc does not use IPALLOC_RANGE in any way and has no hard-coding to handle 10.32.0.0/12 any differently.
WARN: 2019/07/26 08:50:52.580450 UDP connection from 10.150.14.5:53734 to 10.32.1.172:161 blocked by Weave NPC.
Are you seeing this log message just at the pod start time or any time a pod is trying to access 10.32.1.172? Can you try to access 10.32.1.172 from the pod?
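For example, something along these lines (the pod name, namespace, and the availability of ping/nc inside the image are assumptions on my side):

$ kubectl exec -n <namespace> <pod-name> -- ping -c 3 10.32.1.172
$ kubectl exec -n <namespace> <pod-name> -- nc -u -v -w 3 10.32.1.172 161

It may also help to confirm which range Weave is actually allocating pod addresses from, e.g.:

$ kubectl exec -n kube-system <weave-net-pod> -c weave -- /home/weave/weave --local status ipam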
I get this error message any time the pod tries to access 10.32.1.172. If I try it with a ping from inside the pod, the request times out.
I was not able to reproduce this issue. Can you please share the output of iptables -t filter -L -n -v?
Sure, here is the output:
Chain INPUT (policy ACCEPT 2119 packets, 617K bytes)
pkts bytes target prot opt in out source destination
2138M 879G KUBE-SERVICES all -- * * 0.0.0.0/0 0.0.0.0/0 ctstate NEW /* kubernetes service portals */
2138M 879G KUBE-EXTERNAL-SERVICES all -- * * 0.0.0.0/0 0.0.0.0/0 ctstate NEW /* kubernetes externally-visible service portals */
2295M 961G KUBE-FIREWALL all -- * * 0.0.0.0/0 0.0.0.0/0
51M 5188M WEAVE-NPC-EGRESS all -- weave * 0.0.0.0/0 0.0.0.0/0
Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
pkts bytes target prot opt in out source destination
14G 2662G WEAVE-NPC-EGRESS all -- weave * 0.0.0.0/0 0.0.0.0/0 /* NOTE: this must go before '-j KUBE-FORWARD' */
13G 3914G WEAVE-NPC all -- * weave 0.0.0.0/0 0.0.0.0/0 /* NOTE: this must go before '-j KUBE-FORWARD' */
32M 2930M NFLOG all -- * weave 0.0.0.0/0 0.0.0.0/0 state NEW nflog-group 86
32M 2956M DROP all -- * weave 0.0.0.0/0 0.0.0.0/0
1302M 107G ACCEPT all -- weave !weave 0.0.0.0/0 0.0.0.0/0
0 0 ACCEPT all -- * weave 0.0.0.0/0 0.0.0.0/0 ctstate RELATED,ESTABLISHED
7874 7929K KUBE-FORWARD all -- * * 0.0.0.0/0 0.0.0.0/0 /* kubernetes forwarding rules */
73 4893 KUBE-SERVICES all -- * * 0.0.0.0/0 0.0.0.0/0 ctstate NEW /* kubernetes service portals */
7874 7929K DOCKER-USER all -- * * 0.0.0.0/0 0.0.0.0/0
7874 7929K DOCKER-ISOLATION-STAGE-1 all -- * * 0.0.0.0/0 0.0.0.0/0
3166 7001K ACCEPT all -- * docker0 0.0.0.0/0 0.0.0.0/0 ctstate RELATED,ESTABLISHED
0 0 DOCKER all -- * docker0 0.0.0.0/0 0.0.0.0/0
2897 774K ACCEPT all -- docker0 !docker0 0.0.0.0/0 0.0.0.0/0
0 0 ACCEPT all -- docker0 docker0 0.0.0.0/0 0.0.0.0/0
Chain OUTPUT (policy ACCEPT 2011 packets, 433K bytes)
pkts bytes target prot opt in out source destination
1840M 493G KUBE-SERVICES all -- * * 0.0.0.0/0 0.0.0.0/0 ctstate NEW /* kubernetes service portals */
2003M 1140G KUBE-FIREWALL all -- * * 0.0.0.0/0 0.0.0.0/0
Chain DOCKER (1 references)
pkts bytes target prot opt in out source destination
0 0 ACCEPT tcp -- !docker0 docker0 0.0.0.0/0 172.17.0.2 tcp dpt:6901
Chain DOCKER-ISOLATION-STAGE-1 (1 references)
pkts bytes target prot opt in out source destination
2897 774K DOCKER-ISOLATION-STAGE-2 all -- docker0 !docker0 0.0.0.0/0 0.0.0.0/0
7874 7929K RETURN all -- * * 0.0.0.0/0 0.0.0.0/0
Chain DOCKER-ISOLATION-STAGE-2 (1 references)
pkts bytes target prot opt in out source destination
0 0 DROP all -- * docker0 0.0.0.0/0 0.0.0.0/0
2897 774K RETURN all -- * * 0.0.0.0/0 0.0.0.0/0
Chain DOCKER-USER (1 references)
pkts bytes target prot opt in out source destination
7874 7929K RETURN all -- * * 0.0.0.0/0 0.0.0.0/0
Chain KUBE-EXTERNAL-SERVICES (1 references)
pkts bytes target prot opt in out source destination
0 0 REJECT tcp -- * * 0.0.0.0/0 0.0.0.0/0 /* jenkins:db has no endpoints */ ADDRTYPE match dst-type LOCAL tcp dpt:30070 reject-with icmp-port-unreachable
Chain KUBE-FIREWALL (2 references)
pkts bytes target prot opt in out source destination
0 0 DROP all -- * * 0.0.0.0/0 0.0.0.0/0 /* kubernetes firewall for dropping marked packets */ mark match 0x8000/0x8000
Chain KUBE-FORWARD (1 references)
pkts bytes target prot opt in out source destination
0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 /* kubernetes forwarding rules */ mark match 0x4000/0x4000
Chain KUBE-SERVICES (3 references)
pkts bytes target prot opt in out source destination
0 0 REJECT tcp -- * * 0.0.0.0/0 10.150.27.109 /* jenkins:db has no endpoints */ tcp dpt:8080 reject-with icmp-port-unreachable
Chain WEAVE-NPC (1 references)
pkts bytes target prot opt in out source destination
11G 3788G ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 state RELATED,ESTABLISHED
220M 11G ACCEPT all -- * * 0.0.0.0/0 224.0.0.0/4
1307M 112G ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 PHYSDEV match --physdev-out vethwe-bridge --physdev-is-bridged
35M 3088M WEAVE-NPC-DEFAULT all -- * * 0.0.0.0/0 0.0.0.0/0 state NEW
32M 2943M WEAVE-NPC-INGRESS all -- * * 0.0.0.0/0 0.0.0.0/0 state NEW
Chain WEAVE-NPC-DEFAULT (1 references)
pkts bytes target prot opt in out source destination
0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-Rzff}h:=]JaaJl/G;(XJpGjZ[ dst /* DefaultAllow ingress isolation for namespace: kube-public */
482 28920 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-iLgO^}{o=U/*%KE[@=W:l~|9T dst /* DefaultAllow ingress isolation for namespace: ingress-nginx */
28230 2451K ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-P.B|!ZhkAr5q=XZ?3}tMBA+0 dst /* DefaultAllow ingress isolation for namespace: kube-system */
2043K 123M ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-;rGqyMIl1HN^cfDki~Z$3]6!N dst /* DefaultAllow ingress isolation for namespace: default */
0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-}H)Sitb9aiccB*c*A2(YXzg9v dst /* DefaultAllow ingress isolation for namespace: elk */
0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-]B*(W?)t*z5O17G044[gUo#$l dst /* DefaultAllow ingress isolation for namespace: kube-node-lease */
0 0 ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-rU?B%Whj8+|zYyTnSYaO#C)r9 dst /* DefaultAllow ingress isolation for namespace: omd-worker */
322K 19M ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-IK?NZ={sWTaB:Z2.?Yf:+|(.l dst /* DefaultAllow ingress isolation for namespace: elastic-system */
Chain WEAVE-NPC-EGRESS (2 references)
pkts bytes target prot opt in out source destination
11G 2434G ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 state RELATED,ESTABLISHED
145M 7327M RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 PHYSDEV match --physdev-in vethwe-bridge --physdev-is-bridged
228K 14M RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 ADDRTYPE match dst-type LOCAL
78M 3901M RETURN all -- * * 0.0.0.0/0 224.0.0.0/4
2642M 222G WEAVE-NPC-EGRESS-DEFAULT all -- * * 0.0.0.0/0 0.0.0.0/0 state NEW
2632M 221G WEAVE-NPC-EGRESS-CUSTOM all -- * * 0.0.0.0/0 0.0.0.0/0 state NEW mark match ! 0x40000/0x40000
3645 282K NFLOG all -- * * 0.0.0.0/0 0.0.0.0/0 state NEW mark match ! 0x40000/0x40000 nflog-group 86
828K 34M DROP all -- * * 0.0.0.0/0 0.0.0.0/0 mark match ! 0x40000/0x40000
Chain WEAVE-NPC-EGRESS-ACCEPT (40 references)
pkts bytes target prot opt in out source destination
2642M 222G MARK all -- * * 0.0.0.0/0 0.0.0.0/0 MARK or 0x40000
Chain WEAVE-NPC-EGRESS-CUSTOM (1 references)
pkts bytes target prot opt in out source destination
267M 23G WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-URX+9;:x9ldNte:aX8vbA*5{v src /* pods: namespace: omd-worker, selector: -> anywhere (egress) */
267M 23G RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-URX+9;:x9ldNte:aX8vbA*5{v src /* pods: namespace: omd-worker, selector: -> anywhere (egress) */
417K 31M WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-Tdw0WN}DbSt6|lyO#J*~Zh#G( src /* pods: namespace: sakuli, selector: -> anywhere (egress) */
Chain WEAVE-NPC-EGRESS-DEFAULT (1 references)
pkts bytes target prot opt in out source destination
0 0 WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-41s)5vQ^o/xWGz6a20N:~?#|E src /* DefaultAllow egress isolation for namespace: kube-public */
0 0 RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-41s)5vQ^o/xWGz6a20N:~?#|E src /* DefaultAllow egress isolation for namespace: kube-public */
0 0 WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-nmMUaDKV*YkQcP5s?Q[R54Ep3 src /* DefaultAllow egress isolation for namespace: ingress-nginx */
0 0 RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-nmMUaDKV*YkQcP5s?Q[R54Ep3 src /* DefaultAllow egress isolation for namespace: ingress-nginx */
0 0 WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-E1ney4o[ojNrLk.6rOHi;7MPE src /* DefaultAllow egress isolation for namespace: kube-system */
0 0 RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-E1ney4o[ojNrLk.6rOHi;7MPE src /* DefaultAllow egress isolation for namespace: kube-system */
51279 3965K WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-s_+ChJId4Uy_$}G;WdH|~TK)I src /* DefaultAllow egress isolation for namespace: default */
51279 3965K RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-s_+ChJId4Uy_$}G;WdH|~TK)I src /* DefaultAllow egress isolation for namespace: default */
0 0 WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-sui%__gZ}{kX~oZgI_Ttqp=Dp src /* DefaultAllow egress isolation for namespace: kube-node-lease */
0 0 RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-sui%__gZ}{kX~oZgI_Ttqp=Dp src /* DefaultAllow egress isolation for namespace: kube-node-lease */
8895K 752M WEAVE-NPC-EGRESS-ACCEPT all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-DC2Bu*eUU]o+6eknJO^g1I$K8 src /* DefaultAllow egress isolation for namespace: omd-worker */
8895K 752M RETURN all -- * * 0.0.0.0/0 0.0.0.0/0 match-set weave-DC2Bu*eUU]o+6eknJO^g1I$K8 src /* DefaultAllow egress isolation for namespace: omd-worker */
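If I read this output correctly, the "blocked by Weave NPC" warnings correspond to the NFLOG (nflog-group 86) and DROP rules in WEAVE-NPC-EGRESS above: NEW connections leaving the weave bridge that never get marked 0x40000 by WEAVE-NPC-EGRESS-ACCEPT are logged and dropped. One way to check whether the source pod's IP is actually present in its namespace's default-allow egress ipset would be something like the following (the set name is taken from the "namespace: default" rule in the dump above; which namespace the pod 10.150.14.5 belongs to is my guess):

$ ipset list 'weave-s_+ChJId4Uy_$}G;WdH|~TK)I' | grep 10.150.14.5

If the pod's IP is missing from every matching src set, its NEW egress connections would fall through to the NFLOG/DROP rules, which would match the log message.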
What you expected to happen?
The pod should be able to connect to a server that is not part of the cluster.
What happened?
Weave is blocking my requests to servers that are not in the cluster. Even if I delete all my NetworkPolicies, Weave still blocks this traffic. The block shows up in the logs as follows: WARN: 2019/07/26 08:50:52.580450 UDP connection from 10.150.14.5:53734 to 10.32.1.172:161 blocked by Weave NPC.
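(To rule out leftover policies, the list can be double-checked with:

$ kubectl get networkpolicy --all-namespaces

which should report no resources if everything was really deleted.)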
How to reproduce it?
Install a clean Kubernetes cluster, change the podSubnet, and change Weave's IPALLOC_RANGE to match.
weave: the standard net.yaml with the following extra environment variable:
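For illustration, the variable sits in the weave container spec of the DaemonSet roughly like this (the CIDR shown is a placeholder, not necessarily the exact value used here):

      containers:
        - name: weave
          env:
            - name: IPALLOC_RANGE
              value: 10.150.0.0/16   # placeholder CIDR; must match the cluster podSubnet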
Kubernetes set up by kubeadm with the following extra argument in the kubeadm yaml file:
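Roughly like the following kubeadm config fragment (again, the podSubnet value is a placeholder; it has to equal IPALLOC_RANGE above):

apiVersion: kubeadm.k8s.io/v1beta1
kind: ClusterConfiguration
networking:
  podSubnet: 10.150.0.0/16   # placeholder; must equal Weave's IPALLOC_RANGE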
Anything else we need to know?
I installed a clean Kubernetes cluster with kubeadm in version 14.01, using a kubeadm yaml file extended with the networking section from above, and changed the Weave environment to set IPALLOC_RANGE.
Versions:
Logs:
Nothing important; only "Delete: no addresses or remote MAC entries."

$ journalctl -u docker.service --no-pager
dockerd[26622]: time="2019-07-26T10:49:51.598808723+02:00" level=info msg="ignoring event" module=libcontainerd namespace=moby topic=/tasks/delete type="events.TaskDelete"

$ journalctl -u kubelet --no-pager
Jul 26 09:01:34 test-kubernetes kubelet[27132]: I0726 09:01:34.147545 27132 log.go:172] http: superfluous response.WriteHeader call from k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/httplog.(*respLogger).WriteHeader (httplog.go:184)
Jul 26 09:02:01 test-kubernetes kubelet[27132]: E0726 09:02:01.757297 27132 handler.go:302] Error writing response: http2: stream closed
Jul 26 09:09:23 test-kubernetes kubelet[27132]: W0726 09:09:23.233362 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27837591 (27839459)
Jul 26 09:13:12 test-kubernetes kubelet[27132]: W0726 09:13:12.449423 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27840319 (27840805)
Jul 26 09:15:26 test-kubernetes kubelet[27132]: W0726 09:15:26.643177 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27838443 (27840574)
Jul 26 09:26:23 test-kubernetes kubelet[27132]: W0726 09:26:23.241255 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27843671 (27845311)
Jul 26 09:33:21 test-kubernetes kubelet[27132]: W0726 09:33:21.457869 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27845094 (27847739)
Jul 26 09:39:25 test-kubernetes kubelet[27132]: W0726 09:39:25.653126 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27845874 (27848221)
Jul 26 09:43:21 test-kubernetes kubelet[27132]: W0726 09:43:21.248482 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27849529 (27851021)
Jul 26 09:51:11 test-kubernetes kubelet[27132]: W0726 09:51:11.464829 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27851833 (27853608)
Jul 26 09:58:19 test-kubernetes kubelet[27132]: W0726 09:58:19.661707 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27853866 (27854717)
Jul 26 09:59:25 test-kubernetes kubelet[27132]: W0726 09:59:25.255294 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27855149 (27856271)
Jul 26 10:05:53 test-kubernetes kubelet[27132]: W0726 10:05:53.472682 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27857838 (27858585)
Jul 26 10:14:03 test-kubernetes kubelet[27132]: W0726 10:14:03.263521 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27860566 (27861308)
Jul 26 10:19:05 test-kubernetes kubelet[27132]: W0726 10:19:05.670888 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27860203 (27861816)
Jul 26 10:25:43 test-kubernetes kubelet[27132]: W0726 10:25:43.482585 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27862775 (27865274)
Jul 26 10:29:32 test-kubernetes kubelet[27132]: W0726 10:29:32.270565 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27865533 (27866509)
Jul 26 10:37:29 test-kubernetes kubelet[27132]: W0726 10:37:29.678409 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27867224 (27868319)
Jul 26 10:42:39 test-kubernetes kubelet[27132]: W0726 10:42:39.279341 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27870818 (27871101)
Jul 26 10:43:22 test-kubernetes kubelet[27132]: W0726 10:43:22.489597 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27869524 (27871349)
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.264124 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24617_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24617_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.264407 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24617_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24617_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.364445 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24640_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24640_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.364497 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24640_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24640_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.390279 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24646_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24646_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.390332 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24646_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24646_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.393679 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24646_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24646_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.441005 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.441307 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.441519 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.441735 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/cpu,cpuacct/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/cpu,cpuacct/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.441919 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.442089 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.442311 27132 container.go:409] Failed to create summary reader for "/libcontainer_24640_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.442326 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24661_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24661_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.442884 27132 container.go:409] Failed to create summary reader for "/libcontainer_24640_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.443177 27132 container.go:409] Failed to create summary reader for "/libcontainer_24646_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.443491 27132 container.go:409] Failed to create summary reader for "/libcontainer_24646_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.443798 27132 container.go:409] Failed to create summary reader for "/libcontainer_24661_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.476378 27132 container.go:523] Failed to update stats for container "/libcontainer_24676_systemd_test_default.slice": open /sys/fs/cgroup/memory/libcontainer_24676_systemd_test_default.slice/memory.use_hierarchy: no such file or directory, continuing to push stats
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.476514 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_24676_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_24676_systemd_test_default.slice: no such file or directory
Jul 26 10:49:51 test-kubernetes kubelet[27132]: W0726 10:49:51.476549 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_24676_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_24676_systemd_test_default.slice: no such file or directory
Jul 26 10:53:01 test-kubernetes kubelet[27132]: E0726 10:53:01.853036 27132 handler.go:302] Error writing response: http2: stream closed
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690342 27132 container.go:409] Failed to create summary reader for "/libcontainer_32795_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690475 27132 container.go:409] Failed to create summary reader for "/libcontainer_32795_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690519 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690572 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690592 27132 container.go:409] Failed to create summary reader for "/libcontainer_32801_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690594 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690632 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/cpu,cpuacct/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/cpu,cpuacct/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690645 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690658 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.690668 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_32814_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_32814_systemd_test_default.slice: no such file or directory
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.691844 27132 container.go:409] Failed to create summary reader for "/libcontainer_32801_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.691957 27132 container.go:409] Failed to create summary reader for "/libcontainer_32814_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.876194 27132 container.go:523] Failed to update stats for container "/libcontainer_32847_systemd_test_default.slice": read /sys/fs/cgroup/cpu,cpuacct/libcontainer_32847_systemd_test_default.slice/cpuacct.stat: no such device, continuing to push stats
Jul 26 10:57:02 test-kubernetes kubelet[27132]: W0726 10:57:02.876662 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_32847_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_32847_systemd_test_default.slice: no such file or directory
Jul 26 10:58:58 test-kubernetes kubelet[27132]: W0726 10:58:58.497599 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27875564 (27876579)
Jul 26 10:59:26 test-kubernetes kubelet[27132]: W0726 10:59:26.688055 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27873684 (27875509)
Jul 26 10:59:41 test-kubernetes kubelet[27132]: W0726 10:59:41.285944 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27875339 (27876889)
Jul 26 11:14:14 test-kubernetes kubelet[27132]: W0726 11:14:14.505202 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27880772 (27881711)
Jul 26 11:16:10 test-kubernetes kubelet[27132]: W0726 11:16:10.699339 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27880952 (27881198)
Jul 26 11:17:25 test-kubernetes kubelet[27132]: W0726 11:17:25.292297 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27881004 (27882777)
Jul 26 11:31:28 test-kubernetes kubelet[27132]: W0726 11:31:28.511127 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27885896 (27887503)
Jul 26 11:33:06 test-kubernetes kubelet[27132]: W0726 11:33:06.706490 27132 reflector.go:289] k8s.io/kubernetes/pkg/kubelet/config/apiserver.go:47: watch of *v1.Pod ended with: too old resource version: 27886578 (27887055)
Jul 26 11:34:18 test-kubernetes kubelet[27132]: W0726 11:34:18.299295 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27887090 (27888507)
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.929869 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_81942_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_81942_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.993748 27132 container.go:409] Failed to create summary reader for "/libcontainer_81936_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.993859 27132 container.go:409] Failed to create summary reader for "/libcontainer_81936_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.993960 27132 container.go:409] Failed to create summary reader for "/libcontainer_81942_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994042 27132 container.go:409] Failed to create summary reader for "/libcontainer_81942_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994166 27132 container.go:409] Failed to create summary reader for "/libcontainer_81956_systemd_test_default.slice": none of the resources are being tracked.
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994523 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994579 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994611 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994651 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/cpu,cpuacct/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/cpu,cpuacct/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994668 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/blkio/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/blkio/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994684 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/memory/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/memory/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:47 test-kubernetes kubelet[27132]: W0726 11:43:47.994695 27132 raw.go:87] Error while processing event ("/sys/fs/cgroup/devices/libcontainer_81956_systemd_test_default.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/devices/libcontainer_81956_systemd_test_default.slice: no such file or directory
Jul 26 11:43:48 test-kubernetes kubelet[27132]: W0726 11:43:48.105815 27132 container.go:523] Failed to update stats for container "/libcontainer_81986_systemd_test_default.slice": failed to parse memory.usage_in_bytes - open /sys/fs/cgroup/memory/libcontainer_81986_systemd_test_default.slice/memory.usage_in_bytes: no such file or directory, continuing to push stats
Jul 26 11:43:48 test-kubernetes kubelet[27132]: W0726 11:43:48.139868 27132 helpers.go:137] readString: Failed to read "/sys/fs/cgroup/memory/libcontainer_81992_systemd_test_default.slice/memory.limit_in_bytes": read /sys/fs/cgroup/memory/libcontainer_81992_systemd_test_default.slice/memory.limit_in_bytes: no such device
Jul 26 11:43:48 test-kubernetes kubelet[27132]: W0726 11:43:48.144078 27132 container.go:422] Failed to get RecentStats("/libcontainer_81992_systemd_test_default.slice") while determining the next housekeeping: unable to find data in memory cache
Jul 26 11:48:33 test-kubernetes kubelet[27132]: W0726 11:48:33.307027 27132 reflector.go:289] object-"kube-system"/"coredns": watch of *v1.ConfigMap ended with: too old resource version: 27892735 (27893294)
Jul 26 11:49:23 test-kubernetes kubelet[27132]: W0726 11:49:23.517903 27132 reflector.go:289] object-"kube-system"/"kube-proxy": watch of *v1.ConfigMap ended with: too old resource version: 27891809 (27893561)
$ kubectl get events
LAST SEEN   TYPE     REASON   OBJECT                       MESSAGE
55m         Normal   CREATE   ingress/my-nginx-ingress-a   Ingress default/my-nginx-ingress-a
45m         Normal   CREATE   ingress/my-nginx-ingress-a   Ingress default/my-nginx-ingress-a
31m         Normal   CREATE   ingress/my-nginx-ingress-a   Ingress default/my-nginx-ingress-a
18m         Normal   CREATE   ingress/my-nginx-ingress-a   Ingress default/my-nginx-ingress-a
55m         Normal   CREATE   ingress/my-nginx-ingress-b   Ingress default/my-nginx-ingress-b
45m         Normal   CREATE   ingress/my-nginx-ingress-b   Ingress default/my-nginx-ingress-b
31m         Normal   CREATE   ingress/my-nginx-ingress-b   Ingress default/my-nginx-ingress-b
18m         Normal   CREATE   ingress/my-nginx-ingress-b   Ingress default/my-nginx-ingress-b