tektoncd / pipeline

A cloud-native Pipeline resource.
https://tekton.dev
Apache License 2.0

Internal error occurred: context deadline exceeded #3317

Closed: c5haw closed this issue 4 years ago

c5haw commented 4 years ago

Expected Behavior

Creating a new ClusterTask and TaskRun should succeed.

Actual Behavior

Internal error occurred: failed calling webhook "webhook.pipeline.tekton.dev": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded
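
When this webhook timeout shows up, a useful first check is whether the defaulting webhook's pod and Service are actually running and registered. A minimal sketch, assuming the default install from the release manifest (the Service name tekton-pipelines-webhook is taken from the error URL; the other resource names are assumptions):

# is the webhook pod up, and does its Service exist?
kubectl get pods -n tekton-pipelines
kubectl get svc tekton-pipelines-webhook -n tekton-pipelines
# find the webhook configuration the API server is calling (webhook.pipeline.tekton.dev)
kubectl get mutatingwebhookconfigurations | grep tekton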

Steps to Reproduce the Problem

Pre-requisites

Steps

  1. Install Tekton: `kubectl apply --filename https://storage.googleapis.com/tekton-releases/pipeline/latest/release.yaml`

  2. Create Kustomization template and resources

base/kustomization.yaml

apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
namespace: tekton-pipelines
resources:
  - task.yaml
generatorOptions:
  disableNameSuffixHash: true
configMapGenerator:
  - name: config-artifact-pvc
    literals:
      - size=5Gi
      - storageClassName=standard

base/task.yaml

apiVersion: tekton.dev/v1beta1
kind: ClusterTask
metadata:
  name: git-clone
spec:
  workspaces:
    - name: shared-workspace
  params:
    - name: url
      type: string
      description: the url to the project git repository to clone
    - name: subdirectory
      type: string
      description: the subdirectory within the workspace to clone the project within
      default: /
    - name: revision
      type: string
      description: the revision/branch within the project to checkout
  stepTemplate:
    image: alpine/git
  steps:
    - name: git-clone
      args:
        - clone $(params.url) $(workspaces.shared-workspace.path)$(params.subdirectory)
    - name: git-checkout
      args:
        - checkout $(params.revision)

---

apiVersion: tekton.dev/v1beta1
kind: TaskRun
metadata:
  name: checkout
spec:
  taskRef:
    name: git-clone
    kind: ClusterTask
  params:
    - name: url
      value: git@github.com:sauce-consortia/zookeeper.git
    - name: revision
      value: master
  workspaces:
    - name: shared-workspace
      emptyDir: {}
  3. Apply the kustomization
    
    kustomize build base | kubectl apply -f - -v=9
    I1001 16:07:07.361864   14169 loader.go:375] Config loaded from file:  /home/sa_103893306338111802720/.kube/config
    I1001 16:07:07.363068   14169 round_trippers.go:423] curl -k -v -XGET  -H "Accept: application/com.github.proto-openapi.spec.v2@v1.0+protobuf" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/openapi/v2?timeout=32s'
    I1001 16:07:07.383701   14169 round_trippers.go:443] GET https://34.89.39.224/openapi/v2?timeout=32s 200 OK in 20 milliseconds
    I1001 16:07:07.383720   14169 round_trippers.go:449] Response Headers:
    I1001 16:07:07.383725   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:07:07.383729   14169 round_trippers.go:452]     Content-Type: application/octet-stream
    I1001 16:07:07.383734   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:07:07 GMT
    I1001 16:07:07.383738   14169 round_trippers.go:452]     Etag: "D69398A3A2BD7EBDCF2FF775A161C375F43404286653DBA2BF05BFC7EF7C1B8F8ED29D3D7A9DB69C56A9CCE7255975755B035D4482A83F968450AB31FE26E1B0"
    I1001 16:07:07.383743   14169 round_trippers.go:452]     X-Varied-Accept: application/com.github.proto-openapi.spec.v2@v1.0+protobuf
    I1001 16:07:07.383747   14169 round_trippers.go:452]     Audit-Id: de107b97-14fb-47a5-a399-eeb7fcec1186
    I1001 16:07:07.383751   14169 round_trippers.go:452]     Last-Modified: Thu, 01 Oct 2020 14:03:54 GMT
    I1001 16:07:07.383755   14169 round_trippers.go:452]     Vary: Accept-Encoding
    I1001 16:07:07.383759   14169 round_trippers.go:452]     Vary: Accept
    I1001 16:07:07.383763   14169 round_trippers.go:452]     X-From-Cache: 1
    I1001 16:07:07.383767   14169 round_trippers.go:452]     Accept-Ranges: bytes
    I1001 16:07:07.526830   14169 request.go:1095] Response Body:
    00000000  0a 03 32 2e 30 12 16 0a  0a 4b 75 62 65 72 6e 65  |..2.0....Kuberne|
    00000010  74 65 73 12 08 76 31 2e  31 36 2e 31 33 42 89 8d  |tes..v1.16.13B..|
    00000020  a3 01 12 98 03 0a 28 2f  61 70 69 73 2f 72 62 61  |......(/apis/rba|
    00000030  63 2e 61 75 74 68 6f 72  69 7a 61 74 69 6f 6e 2e  |c.authorization.|
    00000040  6b 38 73 2e 69 6f 2f 76  31 62 65 74 61 31 2f 12  |k8s.io/v1beta1/.|
    00000050  eb 02 12 e8 02 0a 19 72  62 61 63 41 75 74 68 6f  |.......rbacAutho|
    00000060  72 69 7a 61 74 69 6f 6e  5f 76 31 62 65 74 61 31  |rization_v1beta1|
    00000070  1a 17 67 65 74 20 61 76  61 69 6c 61 62 6c 65 20  |..get available |
    00000080  72 65 73 6f 75 72 63 65  73 2a 27 67 65 74 52 62  |resources*'getRb|
    00000090  61 63 41 75 74 68 6f 72  69 7a 61 74 69 6f 6e 56  |acAuthorizationV|
    000000a0  31 62 65 74 61 31 41 50  49 52 65 73 6f 75 72 63  |1beta1APIResourc|
    000000b0  65 73 32 10 61 70 70 6c  69 63 61 74 69 6f 6e 2f  |es2.application/|
    000000c0  6a 73 6f 6e 32 10 61 70  70 6c 69 63 61 74 69 6f  |json2.applicatio|
    000000d0  6e 2f 79 61 6d 6c 32 23  61 70 70 6c 69 63 61 74  |n/yaml2#applicat|
    000000e0  69 6f 6e 2f 76 6e 64 2e  6b 75 62 65 72 6e 65 74  |ion/vnd.kubernet|
    000000f0  65 73 2e 70 72 6f 74 6f  62 75 66 3a 10 61 70 70  |es.protobuf:.app|
    00000100  6c 69 63 61 74 69 6f 6e  2f 6a 73 6f 6e 3a 10 61  |lication/json:.a|
    00000110  70 70 6c 69 63 61 74 69  6f 6e 2f 79 61 6d 6c 3a  |pplication/yaml:|
    00000120  23 61 70 70 6c 69 63 61  74 69 6f 6e 2f 76 6e 64  |#application/vnd|
    00000130  2e 6b 75 62 65 72 6e 65  74 65 73 2e 70 72 6f 74  |.kubernetes.prot|
    00000140  6f 62 75 66 4a 70 0a 55  0a 03 32 30 30 12 4e 0a  |obufJp.U..200.N.|
    00000150  4c 0a 02 4f 4b 12 46 0a  44 0a 42 23 2f 64 65 66  |L..OK.F.D.B#/def|
    00000160  69 6e 69 74 69 6f 6e 73  2f 69 6f 2e 6b 38 73 2e  |initions/io.k8s.|
    00000170  61 70 69 6d 61 63 68 69  6e 65 72 79 2e 70 6b 67  |apimachinery.pkg|
    00000180  2e 61 70 69 73 2e 6d 65  74 61 2e 76 31 2e 41 50  |.apis.meta.v1.AP|
    00000190  49 52 65 73 6f 75 72 63  65 4c 69 73 74 0a 17 0a  |IResourceList...|
    000001a0  03 34 30 31 12 10 0a 0e  0a 0c 55 6e 61 75 74 68  |.401......Unauth|
    000001b0  6f 72 69 7a 65 64 52 05  68 74 74 70 73 12 9b 27  |orizedR.https..'|
    000001c0  0a 31 2f 61 70 69 73 2f  73 74 6f 72 61 67 65 2e  |.1/apis/storage.|
    000001d0  6b 38 73 2e 69 6f 2f 76  31 62 65 74 61 31 2f 77  |k8s.io/v1beta1/w|
    000001e0  61 74 63 68 2f 73 74 6f  72 61 67 65 63 6c 61 73  |atch/storageclas|
    000001f0  73 65 73 12 e5 26 12 c9  04 0a 0f 73 74 6f 72 61  |ses..&.....stora|
    00000200  67 65 5f 76 31 62 65 74  61 31 1a 78 77 61 74 63  |ge_v1beta1.xwatc|
    00000210  68 20 69 6e 64 69 76 69  64 75 61 6c 20 63 68 61  |h individual cha|
    00000220  6e 67 65 73 20 74 6f 20  61 20 6c 69 73 74 20 6f  |nges to a list o|
    00000230  66 20 53 74 6f 72 61 67  65 43 6c 61 73 73 2e 20  |f StorageClass. |
    00000240  64 65 70 72 65 63 61 74  65 64 3a 20 75 73 65 20  |deprecated: use |
    00000250  74 68 65 20 27 77 61 74  63 68 27 20 70 61 72 61  |the 'watch' para|
    00000260  6d 65 74 65 72 20 77 69  74 68 20 61 20 6c 69 73  |meter with a lis|
    00000270  74 20 6f 70 65 72 61 74  69 6f 6e 20 69 6e 73 74  |t operation inst|
    00000280  65 61 64 2e 2a 23 77 61  74 63 68 53 74 6f 72 61  |ead.*#watchStora|
    00000290  67 65 56 31 62 65 74 61  31 53 74 6f 72 61 67 65  |geV1beta1Storage|
    000002a0  43 6c 61 73 73 4c 69 73  74 32 10 61 70 70 6c 69  |ClassList2.appli|
    000002b0  63 61 74 69 6f 6e 2f 6a  73 6f 6e 32 10 61 70 70  |cation/json2.app|
    000002c0  6c 69 63 61 74 69 6f 6e  2f 79 61 6d 6c 32 23 61  |lication/yaml2#a|
    000002d0  70 70 6c 69 63 61 74 69  6f 6e 2f 76 6e 64 2e 6b  |pplication/vnd.k|
    000002e0  75 62 65 72 6e 65 74 65  73 2e 70 72 6f 74 6f 62  |ubernetes.protob|
    000002f0  75 66 32 1d 61 70 70 6c  69 63 61 74 69 6f 6e 2f  |uf2.application/|
    00000300  6a 73 6f 6e 3b 73 74 72  65 61 6d 3d 77 61 74 63  |json;stream=watc|
    00000310  68 32 30 61 70 70 6c 69  63 61 74 69 6f 6e 2f 76  |h20application/v|
    00000320  6e 64 2e 6b 75 62 65 72  6e 65 74 65 73 2e 70 72  |nd.kubernetes.pr|
    00000330  6f 74 6f 62 75 66 3b 73  74 72 65 61 6d 3d 77 61  |otobuf;stream=wa|
    00000340  74 63 68 3a 03 2a 2f 2a  4a 6b 0a 50 0a 03 32 30  |tch:.*/*Jk.P..20|
    00000350  30 12 49 0a 47 0a 02 4f  4b 12 41 0a 3f 0a 3d 23  |0.I.G..OK.A.?.=#|
    00000360  2f 64 65 66 69 6e 69 74  69 6f 6e 73 2f 69 6f 2e  |/definitions/io.|
    00000370  6b 38 73 2e 61 70 69 6d  61 63 68 69 6e 65 72 79  |k8s.apimachinery|
    00000380  2e 70 6b 67 2e 61 70 69  73 2e 6d 65 74 61 2e 76  |.pkg.apis.meta.v|
    00000390  31 2e 57 61 74 63 68 45  76 65 6e 74 0a 17 0a 03  |1.WatchEvent....|
    000003a0  34 30 31 12 10 0a 0e 0a  0c 55 6e 61 75 74 68 6f  |401......Unautho|
    000003b0  72 69 7a 65 64 52 05 68  74 74 70 73 6a 5f 0a 1f  |rizedR.httpsj_..|
    000003c0  78 2d 6b 75 62 65 72 6e  65 74 65 73 2d 67 72 6f  |x-kubernetes-gro|
    000003d0  75 70 2d 76 65 72 73 69  6f 6e 2d 6b 69 6e 64 12  |up-version-kind.|
    000003e0  3c 12 3a 6b 69 6e 64 3a  20 53 74 6f 72 61 67 65  |<.:kind: Storage|
    000003f0  43 6c 61 73 73 0a 76 65  72 73 69 6f 6e 3a 20 76  |Class.version: v|
    00000400  31 62 65 74 61 31 0a 67  72 6f 75 70 3a 20 73 74  |1beta1.group: st|
    00000410  6f 72 61 67 65 2e 6b 38  73 2e 69 6f 0a 6a 23 0a  |orage.k8s.io.j#.|
    00000420  13 78 2d 6b 75 62 65 72  6e 65 74 65 73 2d 61 63  |.x-kubernetes-ac|
    00000430  74 69 6f 6e 12 0c 12 0a  77 61 74 63 68 6c 69 73  |tion....watchlis|
    00000440  74 0a 4a 97 04 0a 94 04  12 91 04 1a 8e 04 12 05  |t.J.............|
    00000450  71 75 65 72 79 1a e3 03  61 6c 6c 6f 77 57 61 74  |query...allowWat|
    00000460  63 68 42 6f 6f 6b 6d 61  72 6b 73 20 72 65 71 75  |chBookmarks requ|
    00000470  65 73 74 73 20 77 61 74  63 68 20 65 76 65 6e 74  |ests watch event|
    00000480  73 20 77 69 74 68 20 74  79 70 65 20 22 42 4f 4f  |s with type "BOO|
    00000490  4b 4d 41 52 4b 22 2e 20  53 65 72 76 65 72 73 20  |KMARK". Servers |
    000004a0  74 68 61 74 20 64 6f 20  6e 6f 74 20 69 6d 70 6c  |that do not impl|
    000004b0  65 6d 65 6e 74 20 62 6f  6f 6b 6d 61 72 6b 73 20  |ement bookmarks |
    000004c0  6d 61 79 20 69 67 6e 6f  72 65 20 74 68 69 73 20  |may ignore this |
    000004d0  66 6c 61 67 20 61 6e 64  20 62 6f 6f 6b 6d 61 72  |flag and bookmar|
    000004e0  6b 73 20 61 72 65 20 73  65 6e 74 20 61 74 20 74  |ks are sent at t|
    000004f0  68 65 20 73 65 72 76 65  72 27 73 20 64 69 73 63  |he server's disc|
    00000500  72 65 74 69 6f 6e 2e 20  43 6c 69 65 6e 74 73 20  |retion. Clients |
    00000510  73 68 6f 75 6c 64 20 6e  6f 74 20 61 73 73 75 6d  |should not assum|
    00000520  65 20 62 6f 6f 6b 6d 61  72 6b 73 20 61 72 65 20  |e bookmarks are |
    00000530  72 65 74 75 72 6e 65 64  20 61 74 20 61 6e 79 20  |returned at any |
    00000540  73 70 65 63 69 66 69 63  20 69 6e 74 65 72 76 61  |specific interva|
    00000550  6c 2c 20 6e 6f 72 20 6d  61 79 20 74 68 65 79 20  |l, nor may they |
    00000560  61 73 73 75 6d 65 20 74  68 65 20 73 65 72 76 65  |assume the serve|
    00000570  72 20 77 69 6c 6c 20 73  65 6e 64 20 61 6e 79 20  |r will send any |
    00000580  42 4f 4f 4b 4d 41 52 4b  20 65 76 65 6e 74 20 64  |BOOKMARK event d|
    00000590  75 72 69 6e 67 20 61 20  73 65 73 73 69 6f 6e 2e  |uring a session.|
    000005a0  20 49 66 20 74 68 69 73  20 69 73 20 6e 6f 74 20  | If this is not |
    000005b0  61 20 77 61 74 63 68 2c  20 74 68 69 73 20 66 69  |a watch, this fi|
    000005c0  65 6c 64 20 69 73 20 69  67 6e 6f 72 65 64 2e 20  |eld is ignored. |
    000005d0  49 66 20 74 68 65 20 66  65 61 74 75 72 65 20 67  |If the feature g|
    000005e0  61 74 65 20 57 61 74 63  68 42 6f 6f 6b 6d 61 72  |ate WatchBookmar|
    000005f0  6b 73 20 69 73 20 6e 6f  74 20 65 6e 61 62 6c 65  |ks is not enable|
    00000600  64 20 69 6e 20 61 70 69  73 65 72 76 65 72 2c 20  |d in apiserver, |
    00000610  74 68 69 73 20 66 69 65  6c 64 20 69 73 20 69 67  |this field is ig|
    00000620  6e 6f 72 65 64 2e 0a 0a  54 68 69 73 20 66 69 65  |nored...This fie|
    00000630  6c 64 20 69 73 20 62 65  74 61 2e 22 13 61 6c 6c  |ld is beta.".all|
    00000640  6f 77 57 61 74 63 68 42  6f 6f 6b 6d 61 72 6b 73  |owWatchBookmarks|
    00000650  32 07 62 6f 6f 6c 65 61  6e a0 01 01 4a ef 09 0a  |2.boolean...J...|
    00000660  ec 09 12 e9 09 1a e6 09  12 05 71 75 65 72 79 1a  |..........query.|
    00000670  c7 09 54 68 65 20 63 6f  6e 74 69 6e 75 65 20 6f  |..The continue o|
    00000680  70 74 69 6f 6e 20 73 68  6f 75 6c 64 20 62 65 20  |ption should be |
    00000690  73 65 74 20 77 68 65 6e  20 72 65 74 72 69 65 76  |set when retriev|
    000006a0  69 6e 67 20 6d 6f 72 65  20 72 65 73 75 6c 74 73  |ing more results|
    000006b0  20 66 72 6f 6d 20 74 68  65 20 73 65 72 76 65 72  | from the server|
    000006c0  2e 20 53 69 6e 63 65 20  74 68 69 73 20 76 61 6c  |. Since this val|
    000006d0  75 65 20 69 73 20 73 65  72 76 65 72 20 64 65 66  |ue is server def|
    000006e0  69 6e 65 64 2c 20 63 6c  69 65 6e 74 73 20 6d 61  |ined, clients ma|
    000006f0  79 20 6f 6e 6c 79 20 75  73 65 20 74 68 65 20 63  |y only use the c|
    00000700  6f 6e 74 69 6e 75 65 20  76 61 6c 75 65 20 66 72  |ontinue value fr|
    00000710  6f 6d 20 61 20 70 72 65  76 69 6f 75 73 20 71 75  |om a previous qu|
    00000720  65 72 79 20 72 65 73 75  6c 74 20 77 69 74 68 20  |ery result with |
    00000730  69 64 65 6e 74 69 63 61  6c 20 71 75 65 72 79 20  |identical query |
    00000740  70 61 72 61 6d 65 74 65  72 73 20 28 65 78 63 65  |parameters (exce|
    00000750  70 74 20 66 6f 72 20 74  68 65 20 76 61 6c 75 65  |pt for the value|
    00000760  20 6f 66 20 63 6f 6e 74  69 6e 75 65 29 20 61 6e  | of continue) an|
    00000770  64 20 74 68 65 20 73 65  72 76 65 72 20 6d 61 79  |d the server may|
    00000780  20 72 65 6a 65 63 74 20  61 20 63 6f 6e 74 69 6e  | reject a contin|
    00000790  75 65 20 76 61 6c 75 65  20 69 74 20 64 6f 65 73  |ue value it does|
    000007a0  20 6e 6f 74 20 72 65 63  6f 67 6e 69 7a 65 2e 20  | not recognize. |
    000007b0  49 66 20 74 68 65 20 73  70 65 63 69 66 69 65 64  |If the specified|
    000007c0  20 63 6f 6e 74 69 6e 75  65 20 76 61 6c 75 65 20  | continue value |
    000007d0  69 73 20 6e 6f 20 6c 6f  6e 67 65 72 20 76 61 6c  |is no longer val|
    000007e0  69 64 20 77 68 65 74 68  65 72 20 64 75 65 20 74  |id whether due t|
    000007f0  6f 20 65 78 70 69 72 61  74 69 6f 6e 20 28 67 65  |o expiration (ge|
    00000800  6e 65 72 61 6c 6c 79 20  66 69 76 65 20 74 6f 20  |nerally five to |
    00000810  66 69 66 74 65 65 6e 20  6d 69 6e 75 74 [truncated 16871338 chars]
    I1001 16:07:07.623872   14169 round_trippers.go:423] curl -k -v -XGET  -H "Accept: application/json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/api/v1/namespaces/tekton-pipelines/configmaps/config-artifact-pvc'
    I1001 16:07:07.629940   14169 round_trippers.go:443] GET https://34.89.39.224/api/v1/namespaces/tekton-pipelines/configmaps/config-artifact-pvc 200 OK in 6 milliseconds
    I1001 16:07:07.629963   14169 round_trippers.go:449] Response Headers:
    I1001 16:07:07.629969   14169 round_trippers.go:452]     Audit-Id: 994a6f26-3c47-444a-bb64-107f3722f9a1
    I1001 16:07:07.629973   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:07:07.629980   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:07:07.629996   14169 round_trippers.go:452]     Content-Length: 733
    I1001 16:07:07.630000   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:07:07 GMT
    I1001 16:07:07.630030   14169 request.go:1097] Response Body: {"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"config-artifact-pvc","namespace":"tekton-pipelines","selfLink":"/api/v1/namespaces/tekton-pipelines/configmaps/config-artifact-pvc","uid":"be63ea53-aefc-4dad-b46a-dc0ef8487ff6","resourceVersion":"58225560","creationTimestamp":"2020-10-01T14:03:58Z","labels":{"app.kubernetes.io/instance":"default","app.kubernetes.io/part-of":"tekton-pipelines"},"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"v1\",\"kind\":\"ConfigMap\",\"metadata\":{\"annotations\":{},\"labels\":{\"app.kubernetes.io/instance\":\"default\",\"app.kubernetes.io/part-of\":\"tekton-pipelines\"},\"name\":\"config-artifact-pvc\",\"namespace\":\"tekton-pipelines\"}}\n"}}}
    I1001 16:07:07.630303   14169 request.go:1097] Request Body: {"data":{"size":"5GiB","storageClassName":"standard"},"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"v1\",\"data\":{\"size\":\"5GiB\",\"storageClassName\":\"standard\"},\"kind\":\"ConfigMap\",\"metadata\":{\"annotations\":{},\"name\":\"config-artifact-pvc\",\"namespace\":\"tekton-pipelines\"}}\n"},"labels":null}}
    I1001 16:07:07.630356   14169 round_trippers.go:423] curl -k -v -XPATCH  -H "Accept: application/json" -H "Content-Type: application/strategic-merge-patch+json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/api/v1/namespaces/tekton-pipelines/configmaps/config-artifact-pvc?fieldManager=kubectl-client-side-apply'
    I1001 16:07:37.635788   14169 round_trippers.go:443] PATCH https://34.89.39.224/api/v1/namespaces/tekton-pipelines/configmaps/config-artifact-pvc?fieldManager=kubectl-client-side-apply 504 Gateway Timeout in 30005 milliseconds
    I1001 16:07:37.635820   14169 round_trippers.go:449] Response Headers:
    I1001 16:07:37.635826   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:07:37.635831   14169 round_trippers.go:452]     Content-Length: 187
    I1001 16:07:37.635835   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:07:37 GMT
    I1001 16:07:37.635840   14169 round_trippers.go:452]     Audit-Id: b4938603-8393-49f5-b029-42860a3f7554
    I1001 16:07:37.635844   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:07:37.635876   14169 request.go:1097] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Timeout: request did not complete within requested timeout 30s","reason":"Timeout","details":{},"code":504}
    I1001 16:07:37.636327   14169 round_trippers.go:423] curl -k -v -XGET  -H "Accept: application/json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/apis/tekton.dev/v1beta1/clustertasks/git-clone'
    I1001 16:07:37.687022   14169 round_trippers.go:443] GET https://34.89.39.224/apis/tekton.dev/v1beta1/clustertasks/git-clone 404 Not Found in 50 milliseconds
    I1001 16:07:37.687045   14169 round_trippers.go:449] Response Headers:
    I1001 16:07:37.687051   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:07:37.687056   14169 round_trippers.go:452]     Content-Length: 234
    I1001 16:07:37.687060   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:07:37 GMT
    I1001 16:07:37.687064   14169 round_trippers.go:452]     Audit-Id: 0f5bc3d2-ef55-4fda-900b-8ca297e8a533
    I1001 16:07:37.687068   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:07:37.688093   14169 request.go:1097] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"clustertasks.tekton.dev \"git-clone\" not found","reason":"NotFound","details":{"name":"git-clone","group":"tekton.dev","kind":"clustertasks"},"code":404}
    I1001 16:07:37.688347   14169 request.go:1097] Request Body: {"apiVersion":"tekton.dev/v1beta1","kind":"ClusterTask","metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"tekton.dev/v1beta1\",\"kind\":\"ClusterTask\",\"metadata\":{\"annotations\":{},\"name\":\"git-clone\"},\"spec\":{\"params\":[{\"description\":\"the url to the project git repository to clone\",\"name\":\"url\",\"type\":\"string\"},{\"default\":\"/\",\"description\":\"the subdirectory within the workspace to clone the project within\",\"name\":\"subdirectory\",\"type\":\"string\"},{\"description\":\"the revision/branch within the project to checkout\",\"name\":\"revision\",\"type\":\"string\"}],\"stepTemplate\":{\"image\":\"alpine/git\"},\"steps\":[{\"args\":[\"clone $(params.url) $(workspaces.shared-workspace.path)$(params.subdirectory)\"],\"name\":\"git-clone\"},{\"args\":[\"checkout $(params.revision)\"],\"name\":\"git-checkout\"}],\"workspaces\":[{\"name\":\"shared-workspace\"}]}}\n"},"name":"git-clone"},"spec":{"params":[{"description":"the url to the project git repository to clone","name":"url","type":"string"},{"default":"/","description":"the subdirectory within the workspace to clone the project within","name":"subdirectory","type":"string"},{"description":"the revision/branch within the project to checkout","name":"revision","type":"string"}],"stepTemplate":{"image":"alpine/git"},"steps":[{"args":["clone $(params.url) $(workspaces.shared-workspace.path)$(params.subdirectory)"],"name":"git-clone"},{"args":["checkout $(params.revision)"],"name":"git-checkout"}],"workspaces":[{"name":"shared-workspace"}]}}
    I1001 16:07:37.688418   14169 round_trippers.go:423] curl -k -v -XPOST  -H "Accept: application/json" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/apis/tekton.dev/v1beta1/clustertasks?fieldManager=kubectl-client-side-apply'
    I1001 16:08:07.724498   14169 round_trippers.go:443] POST https://34.89.39.224/apis/tekton.dev/v1beta1/clustertasks?fieldManager=kubectl-client-side-apply 500 Internal Server Error in 30036 milliseconds
    I1001 16:08:07.724544   14169 round_trippers.go:449] Response Headers:
    I1001 16:08:07.724550   14169 round_trippers.go:452]     Audit-Id: 167acd3a-de2b-4fde-973e-4c0844b7303d
    I1001 16:08:07.724555   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:08:07.724559   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:08:07.724563   14169 round_trippers.go:452]     Content-Length: 517
    I1001 16:08:07.724567   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:08:07 GMT
    I1001 16:08:07.724605   14169 request.go:1097] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Internal error occurred: failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded","reason":"InternalError","details":{"causes":[{"message":"failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded"}]},"code":500}
    I1001 16:08:07.724926   14169 round_trippers.go:423] curl -k -v -XGET  -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" -H "Accept: application/json" 'https://34.89.39.224/apis/tekton.dev/v1beta1/namespaces/tekton-pipelines/taskruns/checkout'
    I1001 16:08:07.764167   14169 round_trippers.go:443] GET https://34.89.39.224/apis/tekton.dev/v1beta1/namespaces/tekton-pipelines/taskruns/checkout 404 Not Found in 39 milliseconds
    I1001 16:08:07.764191   14169 round_trippers.go:449] Response Headers:
    I1001 16:08:07.764197   14169 round_trippers.go:452]     Audit-Id: 3fc69cbb-17be-40aa-8bae-05132e3c68d0
    I1001 16:08:07.764201   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:08:07.764205   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:08:07.764216   14169 round_trippers.go:452]     Content-Length: 224
    I1001 16:08:07.764221   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:08:07 GMT
    I1001 16:08:07.765057   14169 request.go:1097] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"taskruns.tekton.dev \"checkout\" not found","reason":"NotFound","details":{"name":"checkout","group":"tekton.dev","kind":"taskruns"},"code":404}
    I1001 16:08:07.765213   14169 round_trippers.go:423] curl -k -v -XGET  -H "Accept: application/json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/api/v1/namespaces/tekton-pipelines'
    I1001 16:08:07.801674   14169 round_trippers.go:443] GET https://34.89.39.224/api/v1/namespaces/tekton-pipelines 200 OK in 36 milliseconds
    I1001 16:08:07.801697   14169 round_trippers.go:449] Response Headers:
    I1001 16:08:07.801703   14169 round_trippers.go:452]     Audit-Id: 7bd0590e-2f5b-4804-a6bd-c1da316089a6
    I1001 16:08:07.801714   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:08:07.801718   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:08:07.801722   14169 round_trippers.go:452]     Content-Length: 695
    I1001 16:08:07.801726   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:08:07 GMT
    I1001 16:08:07.801750   14169 request.go:1097] Response Body: {"kind":"Namespace","apiVersion":"v1","metadata":{"name":"tekton-pipelines","selfLink":"/api/v1/namespaces/tekton-pipelines","uid":"9abe61fc-3fca-4511-b14b-abb5f3daf3e9","resourceVersion":"58225318","creationTimestamp":"2020-10-01T14:03:32Z","labels":{"app.kubernetes.io/instance":"default","app.kubernetes.io/part-of":"tekton-pipelines"},"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"v1\",\"kind\":\"Namespace\",\"metadata\":{\"annotations\":{},\"labels\":{\"app.kubernetes.io/instance\":\"default\",\"app.kubernetes.io/part-of\":\"tekton-pipelines\"},\"name\":\"tekton-pipelines\"}}\n"}},"spec":{"finalizers":["kubernetes"]},"status":{"phase":"Active"}}
    I1001 16:08:07.801924   14169 request.go:1097] Request Body: {"apiVersion":"tekton.dev/v1beta1","kind":"TaskRun","metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"tekton.dev/v1beta1\",\"kind\":\"TaskRun\",\"metadata\":{\"annotations\":{},\"name\":\"checkout\",\"namespace\":\"tekton-pipelines\"},\"spec\":{\"params\":[{\"name\":\"url\",\"value\":\"git@github.com:sauce-consortia/zookeeper.git\"},{\"name\":\"revision\",\"value\":\"master\"}],\"taskRef\":{\"kind\":\"ClusterTask\",\"name\":\"git-clone\"},\"workspaces\":[{\"emptyDir\":{},\"name\":\"shared-workspace\"}]}}\n"},"name":"checkout","namespace":"tekton-pipelines"},"spec":{"params":[{"name":"url","value":"git@github.com:sauce-consortia/zookeeper.git"},{"name":"revision","value":"master"}],"taskRef":{"kind":"ClusterTask","name":"git-clone"},"workspaces":[{"emptyDir":{},"name":"shared-workspace"}]}}
    I1001 16:08:07.802000   14169 round_trippers.go:423] curl -k -v -XPOST  -H "Accept: application/json" -H "Content-Type: application/json" -H "User-Agent: kubectl/v1.19.2 (linux/amd64) kubernetes/f574309" 'https://34.89.39.224/apis/tekton.dev/v1beta1/namespaces/tekton-pipelines/taskruns?fieldManager=kubectl-client-side-apply'
    I1001 16:08:37.827242   14169 round_trippers.go:443] POST https://34.89.39.224/apis/tekton.dev/v1beta1/namespaces/tekton-pipelines/taskruns?fieldManager=kubectl-client-side-apply 500 Internal Server Error in 30025 milliseconds
    I1001 16:08:37.827277   14169 round_trippers.go:449] Response Headers:
    I1001 16:08:37.827284   14169 round_trippers.go:452]     Cache-Control: no-cache, private
    I1001 16:08:37.827289   14169 round_trippers.go:452]     Content-Type: application/json
    I1001 16:08:37.827293   14169 round_trippers.go:452]     Content-Length: 517
    I1001 16:08:37.827297   14169 round_trippers.go:452]     Date: Thu, 01 Oct 2020 16:08:37 GMT
    I1001 16:08:37.827301   14169 round_trippers.go:452]     Audit-Id: 6864096c-1fba-47f3-80e5-efb76a839ec4
    I1001 16:08:37.827360   14169 request.go:1097] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"Internal error occurred: failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded","reason":"InternalError","details":{"causes":[{"message":"failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded"}]},"code":500}
    I1001 16:08:37.827614   14169 helpers.go:216] server response object: [{
    "kind": "Status",
    "apiVersion": "v1",
    "metadata": {},
    "status": "Failure",
    "message": "error when applying patch:\n{\"data\":{\"size\":\"5GiB\",\"storageClassName\":\"standard\"},\"metadata\":{\"annotations\":{\"kubectl.kubernetes.io/last-applied-configuration\":\"{\\\"apiVersion\\\":\\\"v1\\\",\\\"data\\\":{\\\"size\\\":\\\"5GiB\\\",\\\"storageClassName\\\":\\\"standard\\\"},\\\"kind\\\":\\\"ConfigMap\\\",\\\"metadata\\\":{\\\"annotations\\\":{},\\\"name\\\":\\\"config-artifact-pvc\\\",\\\"namespace\\\":\\\"tekton-pipelines\\\"}}\\n\"},\"labels\":null}}\nto:\nResource: \"/v1, Resource=configmaps\", GroupVersionKind: \"/v1, Kind=ConfigMap\"\nName: \"config-artifact-pvc\", Namespace: \"tekton-pipelines\"\nfor: \"STDIN\": Timeout: request did not complete within requested timeout 30s",
    "reason": "Timeout",
    "details": {},
    "code": 504
    }]
    I1001 16:08:37.827660   14169 helpers.go:216] server response object: [{
    "kind": "Status",
    "apiVersion": "v1",
    "metadata": {},
    "status": "Failure",
    "message": "error when creating \"STDIN\": Internal error occurred: failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded",
    "reason": "InternalError",
    "details": {
    "causes": [
      {
        "message": "failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded"
      }
    ]
    },
    "code": 500
    }]
    I1001 16:08:37.827808   14169 helpers.go:216] server response object: [{
    "kind": "Status",
    "apiVersion": "v1",
    "metadata": {},
    "status": "Failure",
    "message": "error when creating \"STDIN\": Internal error occurred: failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded",
    "reason": "InternalError",
    "details": {
    "causes": [
      {
        "message": "failed calling webhook \"webhook.pipeline.tekton.dev\": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded"
      }
    ]
    },
    "code": 500
    }]
    F1001 16:08:37.827839   14169 helpers.go:115] Error from server (Timeout): error when applying patch:
    {"data":{"size":"5GiB","storageClassName":"standard"},"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"v1\",\"data\":{\"size\":\"5GiB\",\"storageClassName\":\"standard\"},\"kind\":\"ConfigMap\",\"metadata\":{\"annotations\":{},\"name\":\"config-artifact-pvc\",\"namespace\":\"tekton-pipelines\"}}\n"},"labels":null}}
    to:
    Resource: "/v1, Resource=configmaps", GroupVersionKind: "/v1, Kind=ConfigMap"
    Name: "config-artifact-pvc", Namespace: "tekton-pipelines"
    for: "STDIN": Timeout: request did not complete within requested timeout 30s
    Error from server (InternalError): error when creating "STDIN": Internal error occurred: failed calling webhook "webhook.pipeline.tekton.dev": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded
    Error from server (InternalError): error when creating "STDIN": Internal error occurred: failed calling webhook "webhook.pipeline.tekton.dev": Post https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s: context deadline exceeded
    goroutine 1 [running]:
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.stacks(0xc00000e001, 0xc004110a00, 0x4ad, 0x4ff)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:996 +0xb9
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).output(0x2d11ca0, 0xc000000003, 0x0, 0x0, 0xc004126fc0, 0x2ae6619, 0xa, 0x73, 0x40b200)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:945 +0x191
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).printDepth(0x2d11ca0, 0x3, 0x0, 0x0, 0x2, 0xc000855ad8, 0x1, 0x1)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:718 +0x165
    k8s.io/kubernetes/vendor/k8s.io/klog/v2.FatalDepth(...)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1442
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.fatal(0xc004170480, 0x47f, 0x1)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:93 +0x1f0
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.checkErr(0x1e5c660, 0xc0004ea240, 0x1d05b48)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:177 +0x8b5
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util.CheckErr(...)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/util/helpers.go:115
    k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/apply.NewCmdApply.func1(0xc0002a3b80, 0xc00029acc0, 0x0, 0x3)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/cmd/apply/apply.go:178 +0x12b
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).execute(0xc0002a3b80, 0xc00029ac90, 0x3, 0x3, 0xc0002a3b80, 0xc00029ac90)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:846 +0x2c2
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0xc000259340, 0xc000072180, 0xc00003a0a0, 0x5)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:950 +0x375
    k8s.io/kubernetes/vendor/github.com/spf13/cobra.(*Command).Execute(...)
        /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/github.com/spf13/cobra/command.go:887
    main.main()
        _output/dockerized/go/src/k8s.io/kubernetes/cmd/kubectl/kubectl.go:49 +0x21d

goroutine 6 [chan receive]: k8s.io/kubernetes/vendor/k8s.io/klog/v2.(*loggingT).flushDaemon(0x2d11ca0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:1131 +0x8b created by k8s.io/kubernetes/vendor/k8s.io/klog/v2.init.0 /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/klog/v2/klog.go:416 +0xd8

goroutine 8 [select]: k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0x1d05a80, 0x1e5a0c0, 0xc000592540, 0x1, 0xc00007c0c0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:167 +0x149 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.JitterUntil(0x1d05a80, 0x12a05f200, 0x0, 0x1, 0xc00007c0c0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133 +0x98 k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait.Until(0x1d05a80, 0x12a05f200, 0xc00007c0c0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90 +0x4d created by k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs.InitLogs /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kubectl/pkg/util/logs/logs.go:51 +0x96

goroutine 14 [IO wait]: internal/poll.runtime_pollWait(0x7fdcdfbb9c78, 0x72, 0x1e5d2a0) /usr/local/go/src/runtime/netpoll.go:220 +0x55 internal/poll.(pollDesc).wait(0xc00069a118, 0x72, 0xc000457b00, 0x8f7, 0x8f7) /usr/local/go/src/internal/poll/fd_poll_runtime.go:87 +0x45 internal/poll.(pollDesc).waitRead(...) /usr/local/go/src/internal/poll/fd_poll_runtime.go:92 internal/poll.(FD).Read(0xc00069a100, 0xc000457b00, 0x8f7, 0x8f7, 0x0, 0x0, 0x0) /usr/local/go/src/internal/poll/fd_unix.go:159 +0x1b1 net.(netFD).Read(0xc00069a100, 0xc000457b00, 0x8f7, 0x8f7, 0x203000, 0x6524db, 0xc00068e4e0) /usr/local/go/src/net/fd_posix.go:55 +0x4f net.(conn).Read(0xc0002a0058, 0xc000457b00, 0x8f7, 0x8f7, 0x0, 0x0, 0x0) /usr/local/go/src/net/net.go:182 +0x8e crypto/tls.(atLeastReader).Read(0xc000493740, 0xc000457b00, 0x8f7, 0x8f7, 0x21f, 0x89a, 0xc00060d710) /usr/local/go/src/crypto/tls/conn.go:779 +0x62 bytes.(Buffer).ReadFrom(0xc00068e600, 0x1e58ac0, 0xc000493740, 0x40b605, 0x1a18300, 0x1b94480) /usr/local/go/src/bytes/buffer.go:204 +0xb1 crypto/tls.(Conn).readFromUntil(0xc00068e380, 0x1e5b600, 0xc0002a0058, 0x5, 0xc0002a0058, 0x20e) /usr/local/go/src/crypto/tls/conn.go:801 +0xf3 crypto/tls.(Conn).readRecordOrCCS(0xc00068e380, 0x0, 0x0, 0xc00060dd18) /usr/local/go/src/crypto/tls/conn.go:608 +0x115 crypto/tls.(Conn).readRecord(...) /usr/local/go/src/crypto/tls/conn.go:576 crypto/tls.(Conn).Read(0xc00068e380, 0xc000336000, 0x1000, 0x1000, 0x0, 0x0, 0x0) /usr/local/go/src/crypto/tls/conn.go:1252 +0x15f bufio.(Reader).Read(0xc000129620, 0xc00075e2d8, 0x9, 0x9, 0xc00060dd18, 0x1d06800, 0x97d8cb) /usr/local/go/src/bufio/bufio.go:227 +0x222 io.ReadAtLeast(0x1e588e0, 0xc000129620, 0xc00075e2d8, 0x9, 0x9, 0x9, 0xc00006e060, 0x0, 0x1e58ce0) /usr/local/go/src/io/io.go:314 +0x87 io.ReadFull(...) /usr/local/go/src/io/io.go:333 k8s.io/kubernetes/vendor/golang.org/x/net/http2.readFrameHeader(0xc00075e2d8, 0x9, 0x9, 0x1e588e0, 0xc000129620, 0x0, 0x0, 0xc0041627b0, 0x0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:237 +0x89 k8s.io/kubernetes/vendor/golang.org/x/net/http2.(Framer).ReadFrame(0xc00075e2a0, 0xc0041627b0, 0x0, 0x0, 0x0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/frame.go:492 +0xa5 k8s.io/kubernetes/vendor/golang.org/x/net/http2.(clientConnReadLoop).run(0xc00060dfa8, 0x0, 0x0) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:1794 +0xd8 k8s.io/kubernetes/vendor/golang.org/x/net/http2.(ClientConn).readLoop(0xc0002f5680) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:1716 +0x6f created by k8s.io/kubernetes/vendor/golang.org/x/net/http2.(Transport).newClientConn /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:695 +0x66e

goroutine 67 [runnable]: k8s.io/kubernetes/vendor/golang.org/x/net/http2.(clientStream).awaitRequestCancel(0xc004164840, 0xc00054eb00) /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:334 created by k8s.io/kubernetes/vendor/golang.org/x/net/http2.(clientConnReadLoop).handleResponse /workspace/anago-v1.19.2-rc.0.12+19706d90d87784/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/golang.org/x/net/http2/transport.go:2029 +0x768
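
In a failure like this the API server gives up after 30s without the webhook ever answering, so it is also worth checking whether the webhook pod logs any incoming request during the apply. A rough sketch, assuming the deployment is named tekton-pipelines-webhook as in the release manifest:

kubectl -n tekton-pipelines logs deployment/tekton-pipelines-webhook --since=5m

If nothing is logged while kubectl reports the timeout, the request is being dropped before it reaches the pod, which points at a network or firewall problem rather than at the webhook itself.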


Additional Info

- Kubernetes version:

  **Output of `kubectl version`:**

Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T13:41:02Z", GoVersion:"go1.15", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.13-gke.401", GitCommit:"eb94c181eea5290e9da1238db02cfef263542f5f", GitTreeState:"clean", BuildDate:"2020-09-09T00:57:35Z", GoVersion:"go1.13.9b4", Compiler:"gc", Platform:"linux/amd64"}


- Tekton Pipeline version:

  **Output of `tkn version` or `kubectl get pods -n tekton-pipelines -l app=tekton-pipelines-controller -o=jsonpath='{.items[0].metadata.labels.version}'`:**

v0.16.3

kubectl -n kube-system get all

NAME                                                         READY   STATUS    RESTARTS   AGE
pod/event-exporter-gke-8489df9489-n9g2x                      2/2     Running   0          20d
pod/fluentd-gke-8pjpw                                        2/2     Running   1          42d
pod/fluentd-gke-q6wpb                                        2/2     Running   2          42d
pod/fluentd-gke-scaler-cd4d654d7-9jnfq                       1/1     Running   0          115d
pod/fluentd-gke-tqbdw                                        2/2     Running   1          42d
pod/fluentd-gke-tsjqw                                        2/2     Running   5          42d
pod/fluentd-gke-wmtfn                                        2/2     Running   1          42d
pod/gke-metrics-agent-5plf2                                  1/1     Running   0          19d
pod/gke-metrics-agent-7rn29                                  1/1     Running   0          19d
pod/gke-metrics-agent-crq2q                                  1/1     Running   0          19d
pod/gke-metrics-agent-d7bwl                                  1/1     Running   0          19d
pod/gke-metrics-agent-vxspp                                  1/1     Running   0          19d
pod/kube-dns-7c976ddbdb-bss7s                                4/4     Running   0          40d
pod/kube-dns-7c976ddbdb-k2hgl                                4/4     Running   0          40d
pod/kube-dns-autoscaler-645f7d66cf-dlggx                     1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-5822cd9e-h1zr                   1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-65b3c7bd-7767                   1/1     Running   0          115d
pod/kube-proxy-gke-sauce-dev-65b3c7bd-bk6h                   1/1     Running   0          106d
pod/kube-proxy-gke-sauce-dev-9fd5ef58-1mg7                   1/1     Running   0          105d
pod/kube-proxy-gke-sauce-dev-9fd5ef58-76kx                   1/1     Running   0          115d
pod/l7-default-backend-678889f899-76zdw                      1/1     Running   0          115d
pod/metrics-server-v0.3.6-64655c969-lpj5n                    2/2     Running   0          20d
pod/prometheus-to-sd-bzk9f                                   1/1     Running   0          19d
pod/prometheus-to-sd-c864t                                   1/1     Running   0          19d
pod/prometheus-to-sd-hbj6r                                   1/1     Running   0          19d
pod/prometheus-to-sd-l4fx9                                   1/1     Running   0          19d
pod/prometheus-to-sd-qfx78                                   1/1     Running   0          19d
pod/stackdriver-metadata-agent-cluster-level-5d7556ff66-7mkbk 2/2    Running   0          40d

NAME                           TYPE        CLUSTER-IP    EXTERNAL-IP   PORT(S)         AGE
service/default-http-backend   NodePort    10.20.0.40    <none>        80:32765/TCP    115d
service/kube-dns               ClusterIP   10.20.0.10    <none>        53/UDP,53/TCP   115d
service/metrics-server         ClusterIP   10.20.0.192   <none>        443/TCP         115d

NAME                                        DESIRED   CURRENT   READY   UP-TO-DATE   AVAILABLE   NODE SELECTOR                                                              AGE
daemonset.apps/fluentd-gke                  5         5         5       5            5           beta.kubernetes.io/os=linux                                                115d
daemonset.apps/gke-metrics-agent            5         5         5       5            5           kubernetes.io/os=linux                                                     115d
daemonset.apps/gke-metrics-agent-windows    0         0         0       0            0           kubernetes.io/os=windows                                                   115d
daemonset.apps/kube-proxy                   0         0         0       0            0           beta.kubernetes.io/os=linux,node.kubernetes.io/kube-proxy-ds-ready=true   115d
daemonset.apps/metadata-proxy-v0.1          0         0         0       0            0           beta.kubernetes.io/os=linux,cloud.google.com/metadata-proxy-ready=true    115d
daemonset.apps/nvidia-gpu-device-plugin     0         0         0       0            0           <none>                                                                     115d
daemonset.apps/prometheus-to-sd             5         5         5       5            5           beta.kubernetes.io/os=linux                                                115d

NAME                                                       READY   UP-TO-DATE   AVAILABLE   AGE
deployment.apps/event-exporter-gke                         1/1     1            1           115d
deployment.apps/fluentd-gke-scaler                         1/1     1            1           115d
deployment.apps/kube-dns                                   2/2     2            2           115d
deployment.apps/kube-dns-autoscaler                        1/1     1            1           115d
deployment.apps/l7-default-backend                         1/1     1            1           115d
deployment.apps/metrics-server-v0.3.6                      1/1     1            1           115d
deployment.apps/stackdriver-metadata-agent-cluster-level   1/1     1            1           115d

NAME                                                                  DESIRED   CURRENT   READY   AGE
replicaset.apps/event-exporter-gke-59b99fdd9c                         0         0         0       65d
replicaset.apps/event-exporter-gke-6c56555957                         0         0         0       115d
replicaset.apps/event-exporter-gke-6c9d8bd8d8                         0         0         0       79d
replicaset.apps/event-exporter-gke-8489df9489                         1         1         1       20d
replicaset.apps/fluentd-gke-scaler-cd4d654d7                          1         1         1       115d
replicaset.apps/kube-dns-56d8cd994f                                   0         0         0       79d
replicaset.apps/kube-dns-5c9ff9fc54                                   0         0         0       115d
replicaset.apps/kube-dns-7c976ddbdb                                   2         2         2       42d
replicaset.apps/kube-dns-autoscaler-645f7d66cf                        1         1         1       115d
replicaset.apps/l7-default-backend-678889f899                         1         1         1       115d
replicaset.apps/metrics-server-v0.3.6-64655c969                       1         1         1       20d
replicaset.apps/metrics-server-v0.3.6-7b7d6c7576                      0         0         0       115d
replicaset.apps/metrics-server-v0.3.6-7dbf8647f6                      0         0         0       115d
replicaset.apps/stackdriver-metadata-agent-cluster-level-55bf76579c   0         0         0       115d
replicaset.apps/stackdriver-metadata-agent-cluster-level-5d7556ff66   1         1         1       42d
replicaset.apps/stackdriver-metadata-agent-cluster-level-5d7c86856c   0         0         0       65d
replicaset.apps/stackdriver-metadata-agent-cluster-level-75579c648d   0         0         0       79d
replicaset.apps/stackdriver-metadata-agent-cluster-level-78f6bd5c76   0         0         0       79d
replicaset.apps/stackdriver-metadata-agent-cluster-level-854694cdd5   0         0         0       115d



I have looked through [this previously closed issue](https://github.com/tektoncd/pipeline/issues/2877), but it did not provide a fix for the problem. I also notice the API server is trying to reach the webhook at `https://tekton-pipelines-webhook.tekton-pipelines.svc:443/defaulting?timeout=30s`. Does the service DNS name need to be `https://tekton-pipelines-webhook.tekton-pipelines.svc.cluster.local:443/defaulting?timeout=30s`, i.e. the full `<SERVICE_NAME>.<NAMESPACE>.svc.cluster.local` form, or one of the other options for service DNS resolution detailed at https://cloud.google.com/kubernetes-engine/docs/concepts/service-discovery?

Tekton is described as working off the shelf on GKE, so I wouldn't expect any DNS changes to be needed, but the name it is attempting to resolve does not look correct to me; at least I have never seen a service name used like this before, i.e. with nothing after `.svc`.
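
As a quick sanity check that the short `.svc` name resolves inside the cluster, the Service can be looked up from a throwaway pod. A minimal sketch (busybox:1.28 is chosen as an assumption because its nslookup handles cluster search domains reliably):

kubectl run dns-test --rm -it --restart=Never --image=busybox:1.28 -- \
  nslookup tekton-pipelines-webhook.tekton-pipelines.svc

If this resolves, the name in the error message is fine and the timeout is happening at the network level rather than in DNS.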
fx2y commented 4 years ago

`<SERVICE_NAME>.<NAMESPACE>.svc` is a correct name; kube-dns will resolve it to the right address without explicitly adding `.cluster.local`. The issue you are encountering is that the GKE control plane cannot reach the webhook admission pod's port directly because of a firewall rule. In Tekton Pipelines' case, the control plane needs direct access to port 8443, so you need to set up the firewall accordingly [1].

EDIT: Here is an example of updating the firewall rule for this case:

gcloud compute firewall-rules update <FIREWALL_NAME> --allow <OTHER_PORT>,tcp:8443
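
For `<FIREWALL_NAME>`, the rule to update is the ingress rule GKE creates automatically for master-to-node traffic. Assuming the usual naming convention (gke-<CLUSTER_NAME>-<HASH>-master, which is an assumption about the cluster setup), something like this can locate it:

# list candidate master-to-node ingress rules created by GKE
gcloud compute firewall-rules list --filter="name~gke-.*-master"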
scott-kawaguchi commented 4 years ago

@fx2y is correct. I had the same issue; look for your cluster master's ingress firewall rule. In the Cloud Console it looks like this: gke-my-cluster-13e19556-master | Ingress | gke-my-cluster-13e19556-node | IP ranges: 172.16.X.X/28 | tcp:10250,443,15017,8443 | Allow | 1000 | default | Off. Add 8443 to the allowed ports.
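
The same change can be made from the CLI; a sketch using the example rule name from the console row above (not the reporter's actual rule), re-specifying the existing ports because --allow replaces the whole list rather than appending:

# see which ports are currently allowed on the master ingress rule
gcloud compute firewall-rules describe gke-my-cluster-13e19556-master --format="value(allowed)"
# re-apply the existing ports plus 8443 for the Tekton webhook
gcloud compute firewall-rules update gke-my-cluster-13e19556-master --allow tcp:10250,tcp:443,tcp:15017,tcp:8443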

c5haw commented 4 years ago

That's brilliant. Thanks for that. I suspected something like this was the case, but could not work out where the firewall rule needed updating.

ishallbethat commented 3 years ago

Thanks @scott-kawaguchi. Below I have added more detailed info on how to fix this, in case anyone else hits the same problem.

<firewall rule name> | Ingress | <network tag of the nodes where the Tekton pods run> | IP ranges: <master IP range> | tcp:10250,443,15017,8443 | Allow | 1000 | default | Off
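
If there is no existing master-to-node rule to update, an equivalent rule matching the row above can be created from scratch; a sketch with a hypothetical rule name and placeholder values for the master CIDR and node network tag:

gcloud compute firewall-rules create allow-gke-master-to-tekton-webhook \
  --network=default \
  --direction=INGRESS \
  --action=ALLOW \
  --rules=tcp:8443 \
  --source-ranges=<MASTER_IP_RANGE> \
  --target-tags=<NODE_NETWORK_TAG>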