Closed: c5haw closed this 4 years ago
<SERVICE_NAME>.<NAMESPACE>.svc is the correct name, and kube-dns will correctly resolve it to the right address without explicitly appending .cluster.local. The issue you are encountering is the inability of the GKE control plane to perform webhook admission directly against the pod's port, because of a firewall rule. In the tekton-pipelines case, the control plane needs direct access to port 8443, so you need to set up the firewall accordingly [1].
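As a quick sanity check of the DNS point above, you can confirm the webhook service name resolves from inside the cluster with a throwaway pod; tekton-pipelines-webhook and tekton-pipelines are the default names from the Tekton release, so adjust if yours differ:

```shell
# Resolve the webhook service FQDN (without .cluster.local) from
# inside the cluster using a temporary busybox pod
kubectl run dns-test --rm -it --restart=Never --image=busybox:1.28 -- \
  nslookup tekton-pipelines-webhook.tekton-pipelines.svc
```

Note this only proves in-cluster DNS works; the admission call comes from the control plane outside the node network, which is why the firewall rule still matters.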
EDIT: Here is an example of updating the firewall rule for this case:
gcloud compute firewall-rules update <FIREWALL_NAME> --allow <OTHER_PORT>,tcp:8443
@fx2y is correct. I had the same issue; look for your cluster master's ingress firewall rule. In the Cloud Console it looks like this:
gke-my-cluster-13e19556-master | Ingress | gke-my-cluster-13e19556-node | IP ranges: 172.16.X.X/28 | tcp:10250,443,15017,8443 | Allow | 1000 | default | Off
Add 8443 to the allowed TCP ports.
That's brilliant. Thanks for that. I suspected something like this was the case, but could not work out where the firewall rule needed updating.
Thanks @scott-kawaguchi. Below I add more detailed info on how to fix it, in case anyone else runs into the same problem.
firewall rule name | Ingress | <network tag for node tekton pods reside> | IP ranges: <master IP range> | tcp:10250,443,15017,8443 | Allow | 1000 | default | Off
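To apply the same fix from the command line, a sketch (the rule name gke-my-cluster-13e19556-master is the example from the console row earlier in the thread, so substitute your own master ingress rule):

```shell
# Inspect the current allow list on the master ingress rule
gcloud compute firewall-rules describe gke-my-cluster-13e19556-master \
  --format="yaml(allowed)"

# --allow REPLACES the existing list, so repeat the old ports
# alongside tcp:8443 for the Tekton webhook
gcloud compute firewall-rules update gke-my-cluster-13e19556-master \
  --allow tcp:10250,tcp:443,tcp:15017,tcp:8443
```

The describe step matters because `--allow` overwrites rather than appends; dropping an existing port (e.g. 10250 for kubelet access) would break other control-plane features.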
Expected Behavior
Create a new ClusterTask and TaskRun
Actual Behavior
Steps to Reproduce the Problem
Pre-requisites
Steps
Install Tekton
kubectl apply --filename https://storage.googleapis.com/tekton-releases/pipeline/latest/release.yaml
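Before creating resources, it can help to confirm the install came up; the namespace and service names below are the defaults from the release manifest:

```shell
# Controller and webhook pods should both reach Running
kubectl get pods -n tekton-pipelines

# The webhook service that the control plane calls during admission
kubectl get svc tekton-pipelines-webhook -n tekton-pipelines
```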
Create Kustomization template and resources
base/kustomization.yaml
base/task.yaml
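The contents of the two files were not included above; purely as a hypothetical sketch of what they might look like (the ClusterTask name and step are made up for illustration; apiVersion tekton.dev/v1beta1 matches Tekton v0.16.x):

```yaml
# base/kustomization.yaml (hypothetical sketch; the real file was
# not included in the issue)
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
resources:
  - task.yaml
---
# base/task.yaml (hypothetical sketch -- name and step are invented)
apiVersion: tekton.dev/v1beta1
kind: ClusterTask
metadata:
  name: example-task
spec:
  steps:
    - name: echo
      image: ubuntu
      script: |
        echo "hello"
```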
goroutine 6 [chan receive]:
k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x2d11ca0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1131 +0x8b
created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:416 +0xd8

goroutine 8 [select]:
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x1d05a80, 0x1e5a0c0, 0xc000592540, 0x1, 0xc00007c0c0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x149
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x1d05a80, 0x12a05f200, 0x0, 0x1, 0xc00007c0c0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x98
k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0x1d05a80, 0x12a05f200, 0xc00007c0c0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d
created by k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs.InitLogs
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs/logs.go:51 +0x96

goroutine 14 [IO wait]:
internal/poll.runtime_pollWait(0x7fdcdfbb9c78, 0x72, 0x1e5d2a0)
    /usr/local/go/src/runtime/netpoll.go:220 +0x55
internal/poll.(*pollDesc).wait(0xc00069a118, 0x72, 0xc000457b00, 0x8f7, 0x8f7)
    /usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45
internal/poll.(*pollDesc).waitRead(...)
    /usr/local/go/src/internal/poll/fd_poll_runtime.go:92
internal/poll.(*FD).Read(0xc00069a100, 0xc000457b00, 0x8f7, 0x8f7, 0x0, 0x0, 0x0)
    /usr/local/go/src/internal/poll/fd_unix.go:159 +0x1b1
net.(*netFD).Read(0xc00069a100, 0xc000457b00, 0x8f7, 0x8f7, 0x203000, 0x6524db, 0xc00068e4e0)
    /usr/local/go/src/net/fd_posix.go:55 +0x4f
net.(*conn).Read(0xc0002a0058, 0xc000457b00, 0x8f7, 0x8f7, 0x0, 0x0, 0x0)
    /usr/local/go/src/net/net.go:182 +0x8e
crypto/tls.(*atLeastReader).Read(0xc000493740, 0xc000457b00, 0x8f7, 0x8f7, 0x21f, 0x89a, 0xc00060d710)
    /usr/local/go/src/crypto/tls/conn.go:779 +0x62
bytes.(*Buffer).ReadFrom(0xc00068e600, 0x1e58ac0, 0xc000493740, 0x40b605, 0x1a18300, 0x1b94480)
    /usr/local/go/src/bytes/buffer.go:204 +0xb1
crypto/tls.(*Conn).readFromUntil(0xc00068e380, 0x1e5b600, 0xc0002a0058, 0x5, 0xc0002a0058, 0x20e)
    /usr/local/go/src/crypto/tls/conn.go:801 +0xf3
crypto/tls.(*Conn).readRecordOrCCS(0xc00068e380, 0x0, 0x0, 0xc00060dd18)
    /usr/local/go/src/crypto/tls/conn.go:608 +0x115
crypto/tls.(*Conn).readRecord(...)
    /usr/local/go/src/crypto/tls/conn.go:576
crypto/tls.(*Conn).Read(0xc00068e380, 0xc000336000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
    /usr/local/go/src/crypto/tls/conn.go:1252 +0x15f
bufio.(*Reader).Read(0xc000129620, 0xc00075e2d8, 0x9, 0x9, 0xc00060dd18, 0x1d06800, 0x97d8cb)
    /usr/local/go/src/bufio/bufio.go:227 +0x222
io.ReadAtLeast(0x1e588e0, 0xc000129620, 0xc00075e2d8, 0x9, 0x9, 0x9, 0xc00006e060, 0x0, 0x1e58ce0)
    /usr/local/go/src/io/io.go:314 +0x87
io.ReadFull(...)
    /usr/local/go/src/io/io.go:333
k8s.io/kubernetes/vendor/golang.org/x/net/http2.readFrameHeader(0xc00075e2d8, 0x9, 0x9, 0x1e588e0, 0xc000129620, 0x0, 0x0, 0xc0041627b0, 0x0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:237 +0x89
k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*Framer).ReadFrame(0xc00075e2a0, 0xc0041627b0, 0x0, 0x0, 0x0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:492 +0xa5
k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*clientConnReadLoop).run(0xc00060dfa8, 0x0, 0x0)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:1794 +0xd8
k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*ClientConn).readLoop(0xc0002f5680)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:1716 +0x6f
created by k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*Transport).newClientConn
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:695 +0x66e

goroutine 67 [runnable]:
k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*clientStream).awaitRequestCancel(0xc004164840, 0xc00054eb00)
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:334
created by k8s.io/kubernetes/vendor/golang.org/x/net/http2.(*clientConnReadLoop).handleResponse
    /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:2029 +0x768
Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T13:41:02Z", GoVersion:"go1.15", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.13-gke.401", GitCommit:"eb94c181eea5290e9da1238db02cfef263542f5f", GitTreeState:"clean", BuildDate:"2020-09-09T00:57:35Z", GoVersion:"go1.13.9b4", Compiler:"gc", Platform:"linux/amd64"}
v0.16.3
kubectl -n kube-system get all

NAME                                                        READY   STATUS    RESTARTS   AGE
pod/event-exporter-gke-8489df9489-n9g2x                     2/2     Running   0          20d
pod/fluentd-gke-8pjpw                                       2/2     Running   1          42d
pod/fluentd-gke-q6wpb                                       2/2     Running   2          42d
pod/fluentd-gke-scaler-cd4d654d7-9jnfq                      1/1     Running   0          115d
pod/fluentd-gke-tqbdw                                       2/2     Running   1          42d
pod/fluentd-gke-tsjqw                                       2/2     Running   5          42d
pod/fluentd-gke-wmtfn                                       2/2     Running   1          42d
pod/gke-metrics-agent-5plf2                                 1/1     Running   0          19d
pod/gke-metrics-agent-7rn29                                 1/1     Running   0          19d
pod/gke-metrics-agent-crq2q                                 1/1     Running   0          19d
pod/gke-metrics-agent-d7bwl                                 1/1     Running   0          19d
pod/gke-metrics-agent-vxspp                                 1/1     Running   0          19d
pod/kube-dns-7c976ddbdb-bss7s                               4/4     Running   0          40d
pod/kube-dns-7c976ddbdb-k2hgl                               4/4     Running   0          40d
pod/kube-dns-autoscaler-645f7d66cf-dlggx                    1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-5822cd9e-h1zr                  1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-65b3c7bd-7767                  1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-65b3c7bd-bk6h                  1/1     Running   0          106d
pod/kube-proxy-gke-sauce-dev-9fd5ef58-1mg7                  1/1     Running   0          105d
pod/kube-proxy-gke-sauce-dev-9fd5ef58-76kx                  1/1     Running   0          115d
pod/l7-default-backend-678889f899-76zdw                     1/1     Running   0          115d
pod/metrics-server-v0.3.6-64655c969-lpj5n                   2/2     Running   0          20d
pod/prometheus-to-sd-bzk9f                                  1/1     Running   0          19d
pod/prometheus-to-sd-c864t                                  1/1     Running   0          19d
pod/prometheus-to-sd-hbj6r                                  1/1     Running   0          19d
pod/prometheus-to-sd-l4fx9                                  1/1     Running   0          19d
pod/prometheus-to-sd-qfx78                                  1/1     Running   0          19d
pod/stackdriver-metadata-agent-cluster-level-5d7556ff66-7mkbk   2/2   Running   0        40d
NAME                           TYPE        CLUSTER-IP    EXTERNAL-IP   PORT(S)         AGE
service/default-http-backend   NodePort    10.20.0.40    <none>        80:32765/TCP    115d
service/kube-dns               ClusterIP   10.20.0.10    <none>        53/UDP,53/TCP   115d
service/metrics-server         ClusterIP   10.20.0.192   <none>        443/TCP         115d
NAME                                        DESIRED   CURRENT   READY   UP-TO-DATE   AVAILABLE   NODE SELECTOR                                                            AGE
daemonset.apps/fluentd-gke                  5         5         5       5            5           beta.kubernetes.io/os=linux                                              115d
daemonset.apps/gke-metrics-agent            5         5         5       5            5           kubernetes.io/os=linux                                                   115d
daemonset.apps/gke-metrics-agent-windows    0         0         0       0            0           kubernetes.io/os=windows                                                 115d
daemonset.apps/kube-proxy                   0         0         0       0            0           beta.kubernetes.io/os=linux,node.kubernetes.io/kube-proxy-ds-ready=true   115d
daemonset.apps/metadata-proxy-v0.1          0         0         0       0            0           beta.kubernetes.io/os=linux,cloud.google.com/metadata-proxy-ready=true    115d
daemonset.apps/nvidia-gpu-device-plugin     0         0         0       0            0           <none>                                                                   115d
daemonset.apps/prometheus-to-sd             5         5         5       5            5           beta.kubernetes.io/os=linux                                              115d
NAME                                                       READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/event-exporter-gke                         1/1     1            1           115d
deployment.apps/fluentd-gke-scaler                         1/1     1            1           115d
deployment.apps/kube-dns                                   2/2     2            2           115d
deployment.apps/kube-dns-autoscaler                        1/1     1            1           115d
deployment.apps/l7-default-backend                         1/1     1            1           115d
deployment.apps/metrics-server-v0.3.6                      1/1     1            1           115d
deployment.apps/stackdriver-metadata-agent-cluster-level   1/1     1            1           115d
NAME                                                                  DESIRED   CURRENT   READY   AGE
replicaset.apps/event-exporter-gke-59b99fdd9c                         0         0         0       65d
replicaset.apps/event-exporter-gke-6c56555957                         0         0         0       115d
replicaset.apps/event-exporter-gke-6c9d8bd8d8                         0         0         0       79d
replicaset.apps/event-exporter-gke-8489df9489                         1         1         1       20d
replicaset.apps/fluentd-gke-scaler-cd4d654d7                          1         1         1       115d
replicaset.apps/kube-dns-56d8cd994f                                   0         0         0       79d
replicaset.apps/kube-dns-5c9ff9fc54                                   0         0         0       115d
replicaset.apps/kube-dns-7c976ddbdb                                   2         2         2       42d
replicaset.apps/kube-dns-autoscaler-645f7d66cf                        1         1         1       115d
replicaset.apps/l7-default-backend-678889f899                         1         1         1       115d
replicaset.apps/metrics-server-v0.3.6-64655c969                       1         1         1       20d
replicaset.apps/metrics-server-v0.3.6-7b7d6c7576                      0         0         0       115d
replicaset.apps/metrics-server-v0.3.6-7dbf8647f6                      0         0         0       115d
replicaset.apps/stackdriver-metadata-agent-cluster-level-55bf76579c   0         0         0       115d
replicaset.apps/stackdriver-metadata-agent-cluster-level-5d7556ff66   1         1         1       42d
replicaset.apps/stackdriver-metadata-agent-cluster-level-5d7c86856c   0         0         0       65d
replicaset.apps/stackdriver-metadata-agent-cluster-level-75579c648d   0         0         0       79d
replicaset.apps/stackdriver-metadata-agent-cluster-level-78f6bd5c76   0         0         0       79d
replicaset.apps/stackdriver-metadata-agent-cluster-level-854694cdd5   0         0         0       115d