emissary-ingress / emissary

open source Kubernetes-native API gateway for microservices built on the Envoy Proxy
https://www.getambassador.io
Apache License 2.0

AES needs a lot of restarts before it reaches the Running state. #2672

Closed attilajanko closed 4 years ago

attilajanko commented 4 years ago

Describe the bug: After deploying AES on an Azure AKS cluster (`aks...   Ready   agent   v1.13.12   4.15.0-1082-azure   docker://3.0.10+azure`), there is a Python error in the logs while the ambassador-... pods are starting, and /diag/ returns a 500 Internal Server Error:

```
2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] INFO: 7C716B56-786E-426C-B03C-96554F086837: 127.0.0.1 "GET /ambassador/v0/diag/" START
2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] ERROR: 'NoneType' object has no attribute 'overview'
Traceback (most recent call last):
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 233, in wrapper
    result = f(*args, reqid=reqid, **kwds)
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 519, in show_overview
    ov = diag.overview(request, app.estats)
AttributeError: 'NoneType' object has no attribute 'overview'
2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] ERROR: 7C716B56-786E-426C-B03C-96554F086837: 127.0.0.1 "GET /ambassador/v0/diag/" 1ms 500 server error
time="2020-05-07 09:36:28" level=error msg="Bad HTTP response" func=github.com/datawire/apro/cmd/amb-sidecar/devportal/server.HTTPGet.func1 file="github.com/datawire/apro@/cmd/amb-sidecar/devportal/server/fetcher.go:165" status_code=500 subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true"
time="2020-05-07 09:36:28" level=error msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true" func=github.com/datawire/apro/cmd/amb-sidecar/devportal/server.HTTPGet file="github.com/datawire/apro@/cmd/amb-sidecar/devportal/server/fetcher.go:172" subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true"
time="2020-05-07 09:36:28" level=info msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true" func="github.com/datawire/apro/cmd/amb-sidecar/devportal/server.(*fetcher)._retrieve" file="github.com/datawire/apro@/cmd/amb-sidecar/devportal/server/fetcher.go:195"
```

To Reproduce Steps to reproduce the behavior:

  1. Just follow this documentation: https://www.getambassador.io/docs/latest/topics/install/

Expected behavior: A running, stable deployment, with pods that start in about one minute.

Versions (please complete the following information): Ambassador Edge Stack 1.4.2 (diagd 1.4.2), Kubernetes v1.13.12 on Azure AKS (see the node info and logs above).

Additional context: The problem persists; deleting the pod does not help, and it also persists when I scale the pods up and down. Every new pod needs 7-10 restarts before it reaches the Running state.
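For anyone trying to reproduce this, a quick way to watch the restart count climb and to see the related events (just a sketch; `<namespace>` is wherever AES was installed, `ambassador` with the default quick-start manifest) might be:

```
# watch the pods; RESTARTS keeps growing while the pods flap between Running and CrashLoopBackOff
kubectl -n <namespace> get pods -w

# recent events, which should show any failing liveness/readiness probes
kubectl -n <namespace> get events --sort-by=.lastTimestamp
```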

attilajanko commented 4 years ago

Maybe the whole log for the pod can be helpful

`2020-05-07 09:36:22 kubewatch [9 TMainThread] 1.4.2 DEBUG: looking up ID for namespace default 2020-05-07 09:36:22 kubewatch [9 TMainThread] 1.4.2 DEBUG: cluster ID URL is d6e_id://860a3871-2493-11e9-9b08-3612afa7c1ba/default 2020-05-07 09:36:23 kubewatch [9 TMainThread] 1.4.2 DEBUG: IngressClass check got 404 2020-05-07 09:36:23 kubewatch [9 TMainThread] 1.4.2 DEBUG: Ambassador does not have permission to read IngressClass resources. To enable IngressClass support, configure RBAC to allow Ambassador to read IngressC lass resources, then restart the Ambassador pod. 2020-05-07 09:36:23 kubewatch [9 TMainThread] 1.4.2 DEBUG: CRD type definition not found for ambassadorinstallations.getambassador.io 2020-05-07 09:36:23 kubewatch [9 TMainThread] 1.4.2 DEBUG: ambassadorinstallations.getambassador.io CRD not available 2020-05-07 09:36:23 kubewatch [9 TMainThread] 1.4.2 DEBUG: cluster ID is 2a49b558-5f94-528e-b62d-10a0adb6457d (from namespace default) 2020-05-07 09:36:23 AMBASSADOR INFO starting with environment: 2020-05-07 09:36:23 AMBASSADOR INFO ==== AMBASSADOR_ADMIN_PORT=tcp://192.168.171.146:8877 AMBASSADOR_ADMIN_PORT_8877_TCP=tcp://192.168.171.146:8877 AMBASSADOR_ADMIN_PORT_8877_TCP_ADDR=192.168.171.146 AMBASSADOR_ADMIN_PORT_8877_TCP_PORT=8877 AMBASSADOR_ADMIN_PORT_8877_TCP_PROTO=tcp AMBASSADOR_ADMIN_SERVICE_HOST=192.168.171.146 AMBASSADOR_ADMIN_SERVICE_PORT=8877 AMBASSADOR_ADMIN_SERVICE_PORT_AMBASSADOR_ADMIN=8877 AMBASSADOR_ADMIN_URL=http://127.0.0.1:8877 AMBASSADOR_CLUSTER_ID=2a49b558-5f94-528e-b62d-10a0adb6457d AMBASSADOR_CONFIG_BASE_DIR=/ambassador AMBASSADOR_INTERNAL_URL=https://127.0.0.1:8443 AMBASSADOR_NAMESPACE=core AMBASSADOR_PORT=tcp://192.168.18.40:80 AMBASSADOR_PORT_80_TCP=tcp://192.168.18.40:80 AMBASSADOR_PORT_80_TCP_ADDR=192.168.18.40 AMBASSADOR_PORT_80_TCP_PORT=80 AMBASSADOR_PORT_80_TCP_PROTO=tcp AMBASSADOR_REDIS_PORT=tcp://192.168.221.1:6379 AMBASSADOR_REDIS_PORT_6379_TCP=tcp://192.168.221.1:6379 AMBASSADOR_REDIS_PORT_6379_TCP_ADDR=192.168.221.1 AMBASSADOR_REDIS_PORT_6379_TCP_PORT=6379 AMBASSADOR_REDIS_PORT_6379_TCP_PROTO=tcp AMBASSADOR_REDIS_SERVICE_HOST=192.168.221.1 AMBASSADOR_REDIS_SERVICE_PORT=6379 AMBASSADOR_SERVICE_HOST=192.168.18.40 AMBASSADOR_SERVICE_PORT=80 AMBASSADOR_SERVICE_PORT_HTTP=80 AMBASSADOR_URL=https://ambassador.default.svc.cluster.local 2020-05-07 09:36:23 AMBASSADOR INFO ==== 2020-05-07 09:36:23 AMBASSADOR INFO launching worker process 'ambex': 'ambex' '-ads' '8003' '/ambassador/envoy' 2020-05-07 09:36:23 AMBASSADOR INFO ambex is PID 195 2020-05-07 09:36:23 AMBASSADOR INFO launching worker process 'diagd': 'diagd' '/ambassador/snapshots' '/ambassador/bootstrap-ads.json' '/ambassador/envoy/envoy.json' '--notices' '/ambassador/notices.json' '--ki ck' 'kill -HUP 1' 2020-05-07 09:36:23 AMBASSADOR INFO diagd is PID 203 time="2020-05-07T09:36:23Z" level=info msg="Ambex 1.4.2 starting..." 
time="2020-05-07T09:36:23Z" level=info msg=Listening port=8003 time="2020-05-07T09:36:23Z" level=info msg="Wrote PID" file=ambex.pid pid=195 time="2020-05-07T09:36:23Z" level=info msg="Pushing snapshot v0" 2020-05-07 09:36:25 diagd 1.4.2 [P203TMainThread] INFO: thread count 17, listening on 0.0.0.0:8877 2020-05-07 09:36:25 diagd 1.4.2 [P203TMainThread] INFO: BOOT: Scout result {"latest_version": "1.0.0", "application": "aes", "cached": false, "timestamp": 1588844185.682892} [2020-05-07 09:36:25 +0000] [203] [INFO] Starting gunicorn 19.9.0 [2020-05-07 09:36:25 +0000] [203] [INFO] Listening at: http://0.0.0.0:8877 (203) [2020-05-07 09:36:25 +0000] [203] [INFO] Using worker: threads [2020-05-07 09:36:25 +0000] [219] [INFO] Booting worker with pid: 219 2020-05-07 09:36:25 diagd 1.4.2 [P219TAEW] INFO: starting Scout checker 2020-05-07 09:36:25 diagd 1.4.2 [P219TAEW] INFO: starting event watcher 2020-05-07 09:36:26 AMBASSADOR INFO diagd running 2020-05-07 09:36:26 AMBASSADOR INFO launching worker process 'watt': 'watt' '--port' '8002' '--notify' 'python /ambassador/post_update.py --watt ' '-s' 'service' '-s' 'ingresses' '-s' 'AuthService' '-s' 'Mappin g' '-s' 'Module' '-s' 'RateLimitService' '-s' 'TCPMapping' '-s' 'TLSContext' '-s' 'TracingService' '-s' 'ConsulResolver' '-s' 'KubernetesEndpointResolver' '-s' 'KubernetesServiceResolver' '-s' 'Host' '-s' 'LogS ervice' '--watch' 'python /ambassador/watch_hook.py' 2020-05-07 09:36:26 AMBASSADOR INFO watt is PID 228 2020-05-07 09:36:26 AMBASSADOR INFO launching worker process 'amb-sidecar': '/ambassador/sidecars/amb-sidecar' 2020-05-07 09:36:26 AMBASSADOR INFO amb-sidecar is PID 231 time="2020-05-07 09:36:26" level=info msg="Ambassador Edge Stack configuation loaded" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE file="github.com/datawire/apro@/cmd/amb-sidecar/runner/main.go:138" 2020/05/07 09:36:27 starting watt... 2020/05/07 09:36:27 kubebootstrap: starting 2020/05/07 09:36:27 consulwatchman: starting 2020/05/07 09:36:27 kubewatchman: starting 2020/05/07 09:36:27 aggregator: starting 2020/05/07 09:36:27 invoker: starting 2020/05/07 09:36:27 api: starting 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "service" in namespace "" 2020/05/07 09:36:27 api: snapshot server listening on: :8002 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "ingresses" in namespace "" I0507 09:36:27.203538 231 merged_client_builder.go:122] Using in-cluster configuration I0507 09:36:27.203717 231 merged_client_builder.go:164] Using in-cluster namespace 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "AuthService" in namespace "" time="2020-05-07 09:36:27" level=info msg="license_secret_watch: installing license secrets watcher..." 
func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE file="github.com/datawire/apro@/cmd/amb-sidecar/ runner/main.go:242" time="2020-05-07 09:36:27" level=info msg="Loading content from git repo" func=github.com/datawire/apro/cmd/amb-sidecar/devportal/content.NewContent file="github.com/datawire/apro@/cmd/amb-sidecar/devportal/con tent/content.go:83" contentBranch=master contentSubdir=/ contentURL="https://github.com/datawire/devportal-content.git" subsystem=content time="2020-05-07 09:36:27" level=info msg="license_secret_watch: starting the AES secret core/ambassador-edge-stack watcher" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 file="github.com/data wire/apro@/cmd/amb-sidecar/runner/main.go:247" MAIN=license_secret_watch time="2020-05-07 09:36:27" level=info msg="license_secret_watch: watching license file \"/home/ambassador/.config/ambassador/license-key\"" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func3 file=" github.com/datawire/apro@/cmd/amb-sidecar/runner/main.go:231" MAIN=license_refresh time="2020-05-07 09:36:27" level=info msg="Creating watch on /home/ambassador/.config/ambassador/" func=github.com/datawire/apro/cmd/amb-sidecar/runner.triggerOnChange file="github.com/datawire/apro@/cmd/amb-si decar/runner/files.go:51" time="2020-05-07 09:36:27" level=error msg="Failed to create watch on /home/ambassador/.config/ambassador/: Changes might require a restart: no such file or directory" func=github.com/datawire/apro/cmd/amb-side car/runner.triggerOnChange file="github.com/datawire/apro@/cmd/amb-sidecar/runner/files.go:54" 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "Mapping" in namespace "" time="2020-05-07 09:36:27" level=info msg="initial count 0" func="github.com/datawire/apro/cmd/amb-sidecar/ratelimits.(RateLimitController).DoWatch" file="github.com/datawire/apro@/cmd/amb-sidecar/ratelimits/r ls_watch.go:81" MAIN=ratelimit_controller 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "Module" in namespace "" 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "RateLimitService" in namespace "" time="2020-05-07 09:36:27" level=info msg="remove /tmp/amb/config: no such file or directory" func="github.com/datawire/apro/cmd/amb-sidecar/ratelimits.(RateLimitController).DoWatch.func1" file="github.com/dat awire/apro@/cmd/amb-sidecar/ratelimits/rls_watch.go:134" MAIN=ratelimit_controller I0507 09:36:27.545298 231 reflector.go:120] Starting reflector (5m0s) from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 I0507 09:36:27.545309 231 reflector.go:158] Listing and watching from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "TCPMapping" in namespace "" time="2020-05-07 09:36:27" level=error msg="0 filters configured" func="github.com/datawire/apro/cmd/amb-sidecar/filters/controller.(Controller).Watch.func1" file="github.com/datawire/apro@/cmd/amb-sidecar/fil ters/controller/controller.go:157" MAIN=auth_controller I0507 09:36:27.699008 231 reflector.go:120] Starting reflector (5m0s) from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 I0507 09:36:27.699029 231 reflector.go:158] Listing and watching from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 I0507 09:36:27.699052 231 reflector.go:120] Starting reflector (5m0s) from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 I0507 
09:36:27.699075 231 reflector.go:158] Listing and watching from k8s.io/client-go@v0.0.0-20191016111102-bec269661e48/tools/cache/reflector.go:96 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "TLSContext" in namespace "" 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "TracingService" in namespace "" 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "ConsulResolver" in namespace "" 2020/05/07 09:36:27 kubebootstrap: adding kubernetes watch for "KubernetesEndpointResolver" in namespace "" 2020/05/07 09:36:28 kubebootstrap: adding kubernetes watch for "KubernetesServiceResolver" in namespace "" 2020/05/07 09:36:28 kubebootstrap: adding kubernetes watch for "Host" in namespace "" 2020/05/07 09:36:28 kubebootstrap: adding kubernetes watch for "LogService" in namespace "" time="2020-05-07 09:36:28" level=warning msg="statsd is not in use" func=github.com/lyft/gostats.NewDefaultStore file="github.com/lyft/gostats@v0.2.6/stats.go:193" 2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] INFO: 7C716B56-786E-426C-B03C-96554F086837: 127.0.0.1 "GET /ambassador/v0/diag/" START 2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] ERROR: 'NoneType' object has no attribute 'overview' Traceback (most recent call last): File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 233, in wrapper result = f(args, reqid=reqid, *kwds) File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 519, in show_overview ov = diag.overview(request, app.estats) AttributeError: 'NoneType' object has no attribute 'overview' 2020-05-07 09:36:28 diagd 1.4.2 [P219TThreadPoolExecutor-0_4] ERROR: 7C716B56-786E-426C-B03C-96554F086837: 127.0.0.1 "GET /ambassador/v0/diag/" 1ms 500 server error time="2020-05-07 09:36:28" level=error msg="Bad HTTP response" func=github.com/datawire/apro/cmd/amb-sidecar/devportal/server.HTTPGet.func1 file="github.com/datawire/apro@/cmd/amb-sidecar/devportal/server/fetch er.go:165" status_code=500 subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true" time="2020-05-07 09:36:28" level=error msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true" func=github.com/datawire/apro/cmd/amb-sidecar/devportal/server.HTTPGet file="github.com/dataw ire/apro@/cmd/amb-sidecar/devportal/server/fetcher.go:172" subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true" time="2020-05-07 09:36:28" level=info msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true" func="github.com/datawire/apro/cmd/amb-sidecar/devportal/server.(fetcher)._retrieve" file="gi thub.com/datawire/apro@/cmd/amb-sidecar/devportal/server/fetcher.go:195" 2020/05/07 09:36:28 kubebootstrap: found 0 "TCPMapping" in namespace "" 2020/05/07 09:36:28 kubebootstrap: sent "TCPMapping" to 1 receivers 2020/05/07 09:36:28 kubebootstrap: found 0 "TracingService" in namespace "" 2020/05/07 09:36:28 kubebootstrap: sent "TracingService" to 1 receivers I0507 09:36:28.740660 231 leaderelection.go:241] attempting to acquire leader lease core/kale... 
time="2020-05-07 09:36:28" level=error msg="the server doesn't have a resource type \"projectcontrollers\"\nkale disabled: earlywatcher.WatchQuery(k8s.Query{Kind:\"projectcontrollers.getambassador.io\", Namespa ce:\"\", FieldSelector:\"\", LabelSelector:\"projects.getambassador.io/ambassador_id=default\", resourceType:k8s.ResourceType{Group:\"\", Version:\"\", Name:\"\", Kind:\"\", Namespaced:false}}, ...)\ngithub.com /datawire/apro/cmd/amb-sidecar/kale.blockUntilAProjectControllerExists\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/kale/kale.go:180\ngithub.com/datawire/apro/cmd/amb-sidecar/kale.Setup.func1\n\tgithub.com/data wire/apro@/cmd/amb-sidecar/kale/kale.go:221\ngithub.com/datawire/apro/cmd/amb-sidecar/group.(Group).Go.func1\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/group/group.go:102\ngithub.com/datawire/apro/cmd/amb-si decar/group.(llGroup).Go.func2\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/group/ll_group.go:95\nruntime.goexit\n\truntime/asm_amd64.s:1357\nruntime error\ngithub.com/datawire/apro/cmd/amb-sidecar/kale.report RuntimeError\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/kale/helpers.go:786\ngithub.com/datawire/apro/cmd/amb-sidecar/kale.Setup.func1\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/kale/kale.go:223\ngithub.com /datawire/apro/cmd/amb-sidecar/group.(Group).Go.func1\n\tgithub.com/datawire/apro@/cmd/amb-sidecar/group/group.go:102\ngithub.com/datawire/apro/cmd/amb-sidecar/group.(llGroup).Go.func2\n\tgithub.com/datawire/ apro@/cmd/amb-sidecar/group/ll_group.go:95\nruntime.goexit\n\truntime/asm_amd64.s:1357" func=github.com/datawire/apro/cmd/amb-sidecar/kale._logErr file="github.com/datawire/apro@/cmd/amb-sidecar/kale/helpers.go :737" MAIN=kale_watcher I0507 09:36:28.759893 231 leaderelection.go:350] lock is held by ambassador-67d64b4b6b-jnpg9 and has not yet expired I0507 09:36:28.759913 231 leaderelection.go:246] failed to acquire lease core/kale 2020/05/07 09:36:28 kubebootstrap: found 0 "KubernetesEndpointResolver" in namespace "" 2020/05/07 09:36:28 kubebootstrap: sent "KubernetesEndpointResolver" to 1 receivers I0507 09:36:28.842214 231 leaderelection.go:241] attempting to acquire leader lease core/acmeclient... 
2020/05/07 09:36:28 kubebootstrap: found 0 "LogService" in namespace "" 2020/05/07 09:36:28 kubebootstrap: sent "LogService" to 1 receivers I0507 09:36:28.848733 231 leaderelection.go:350] lock is held by ambassador-96dcbf8f5-5gsms and has not yet expired I0507 09:36:28.848748 231 leaderelection.go:246] failed to acquire lease core/acmeclient 2020/05/07 09:36:28 kubebootstrap: found 1 "AuthService" in namespace "" 2020/05/07 09:36:28 kubebootstrap: sent "AuthService" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 0 "Module" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "Module" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 1 "RateLimitService" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "RateLimitService" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 0 "ingresses" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "ingresses" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 0 "KubernetesServiceResolver" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "KubernetesServiceResolver" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 30 "service" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "service" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 2 "Mapping" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "Mapping" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 0 "ConsulResolver" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "ConsulResolver" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 0 "TLSContext" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "TLSContext" to 1 receivers 2020/05/07 09:36:29 kubebootstrap: found 1 "Host" in namespace "" 2020/05/07 09:36:29 kubebootstrap: sent "Host" to 1 receivers I0507 09:36:31.553918 231 leaderelection.go:350] lock is held by ambassador-67d64b4b6b-jnpg9 and has not yet expired I0507 09:36:31.553942 231 leaderelection.go:246] failed to acquire lease core/kale 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: reading edge-stack-mappings.yaml (/ambassador/init-config/edge-stack-mappings.yaml) 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: YAML: using C parser 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: YAML: using C dumper 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Parsed pod labels: {'pod-template-hash': '67d64b4b6b', 'service': 'ambassador'} 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Found Ambassador service: ambassador 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Host portal-test setting up 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Host portal-test: TLS secret name is portal-2dtest.aaa.bbb.ccc 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Host portal-test: creating TLSContext portal-test-context 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: IRTLSContext setup good: <IRTLSContext portal-test-context.core: hosts ['portal-test.aaa.bbb.ccc'] secret portal-2dtes t.aaa.bbb.ccc> 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Host portal-test: ACME private key name is https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook 
INFO: Host setup OK: 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: HostFactory: saving host 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Intercept agent not active, skipping initialization 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: HostFactory: FTC True, host_count 1 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: IR: watching Edge Stack 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020-05-07 09:36:31 watch-hook INFO: Intercept agent not active, skipping finalization 2020/05/07 09:36:31 aggregator: watch hook stderr: 2020/05/07 09:36:31 aggregator: found 7 kubernetes watches 2020/05/07 09:36:31 aggregator: found 0 consul watches 2020/05/07 09:36:31 aggregator: waiting for k8s watch: secret|core|metadata.name=ambassador-edge-stack| 2020/05/07 09:36:31 aggregator: waiting for k8s watch: service|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:31 aggregator: waiting for k8s watch: secret|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:31 aggregator: waiting for k8s watch: secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc| 2020/05/07 09:36:31 aggregator: waiting for k8s watch: secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory| 2020/05/07 09:36:31 aggregator: waiting for k8s watch: secret|aaa.bbb.ccc|metadata.name=portal-2dtest| 2020/05/07 09:36:31 aggregator: waiting for k8s watch: TLSContext|core|| 2020/05/07 09:36:31 consulwatchman: processing 0 consul watches 2020/05/07 09:36:31 kubewatchman: processing 7 kubernetes watch specs 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:secret|core|metadata.name=ambassador-edge-stack| 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:service|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:secret|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc| 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory| 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:secret|aaa.bbb.ccc|metadata.name=portal-2dtest| 2020/05/07 09:36:31 kubewatchman: add kubernetes watcher kubernetes:TLSContext|core|| 2020/05/07 09:36:31 kubernetes:secret|core|metadata.name=ambassador-edge-stack|: starting 2020/05/07 09:36:31 kubernetes:service|core||hostname=portal-test.aaa.bbb.ccc: starting 2020/05/07 09:36:31 kubernetes:secret|core||hostname=portal-test.aaa.bbb.ccc: starting 2020/05/07 09:36:31 kubernetes:secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc|: starting 2020/05/07 09:36:31 kubernetes:secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory|: starting 2020/05/07 09:36:31 kubernetes:secret|aaa.bbb.ccc|metadata.name=portal-2dtest|: starting 2020/05/07 09:36:31 kubernetes:TLSContext|core||: starting 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc|: found 1 "secret" in namespace "core" 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc|: sent "secret" to receivers 2020/05/07 09:36:32 kubernetes:secret|core||hostname=portal-test.aaa.bbb.ccc: found 0 "secret" in namespace "core" 2020/05/07 09:36:32 
kubernetes:secret|core||hostname=portal-test.aaa.bbb.ccc: sent "secret" to receivers 2020/05/07 09:36:32 kubernetes:secret|aaa.bbb.ccc|metadata.name=portal-2dtest|: found 0 "secret" in namespace "aaa.bbb.ccc" 2020/05/07 09:36:32 kubernetes:secret|aaa.bbb.ccc|metadata.name=portal-2dtest|: sent "secret" to receivers 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory|: found 1 "secret" in namespace "core" 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory|: sent "secret" to receivers 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=ambassador-edge-stack|: found 1 "secret" in namespace "core" 2020/05/07 09:36:32 kubernetes:secret|core|metadata.name=ambassador-edge-stack|: sent "secret" to receivers 2020/05/07 09:36:32 kubernetes:service|core||hostname=portal-test.aaa.bbb.ccc: found 0 "service" in namespace "core" 2020/05/07 09:36:32 kubernetes:service|core||hostname=portal-test.aaa.bbb.ccc: sent "service" to receivers 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:33 watch-hook INFO: reading edge-stack-mappings.yaml (/ambassador/init-config/edge-stack-mappings.yaml) 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:33 watch-hook INFO: YAML: using C parser 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:33 watch-hook INFO: YAML: using C dumper 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:33 watch-hook INFO: Parsed pod labels: {'pod-template-hash': '67d64b4b6b', 'service': 'ambassador'} 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:33 watch-hook INFO: Found Ambassador service: ambassador 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Host portal-test setting up 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Host portal-test: TLS secret name is portal-2dtest.aaa.bbb.ccc 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Host portal-test: creating TLSContext portal-test-context 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: IRTLSContext setup good: <IRTLSContext portal-test-context.core: hosts ['portal-test.aaa.bbb.ccc'] secret portal-2dtes t.aaa.bbb.ccc> 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Host portal-test: ACME private key name is https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Host setup OK: 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: HostFactory: saving host 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Intercept agent not active, skipping initialization 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: HostFactory: FTC True, host_count 1 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: IR: watching Edge Stack 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020-05-07 09:36:34 watch-hook INFO: Intercept agent not active, skipping finalization 2020/05/07 09:36:34 aggregator: watch hook stderr: 2020/05/07 09:36:34 aggregator: found 7 kubernetes watches 2020/05/07 09:36:34 aggregator: found 0 consul watches 2020/05/07 09:36:34 aggregator: initialized k8s watch: 
secret|core|metadata.name=ambassador-edge-stack| 2020/05/07 09:36:34 aggregator: initialized k8s watch: service|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:34 aggregator: initialized k8s watch: secret|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:34 aggregator: initialized k8s watch: secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc| 2020/05/07 09:36:34 aggregator: initialized k8s watch: secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory| 2020/05/07 09:36:34 kubewatchman: processing 7 kubernetes watch specs 2020/05/07 09:36:34 aggregator: initialized k8s watch: secret|aaa.bbb.ccc|metadata.name=portal-2dtest| 2020/05/07 09:36:34 aggregator: waiting for k8s watch: TLSContext|core|| 2020/05/07 09:36:34 consulwatchman: processing 0 consul watches 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: reading edge-stack-mappings.yaml (/ambassador/init-config/edge-stack-mappings.yaml) 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: YAML: using C parser 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: YAML: using C dumper 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Parsed pod labels: {'pod-template-hash': '67d64b4b6b', 'service': 'ambassador'} 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Found Ambassador service: ambassador 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Host portal-test setting up 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Host portal-test: TLS secret name is portal-2dtest.aaa.bbb.ccc 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Host portal-test: creating TLSContext portal-test-context 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: IRTLSContext setup good: <IRTLSContext portal-test-context.core: hosts ['portal-test.aaa.bbb.ccc'] secret portal-2dtes t.aaa.bbb.ccc> 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Host portal-test: ACME private key name is https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Host setup OK: 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: HostFactory: saving host 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Intercept agent not active, skipping initialization 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: HostFactory: FTC True, host_count 1 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: IR: watching Edge Stack 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020-05-07 09:36:35 watch-hook INFO: Intercept agent not active, skipping finalization 2020/05/07 09:36:36 aggregator: watch hook stderr: 2020/05/07 09:36:36 aggregator: found 7 kubernetes watches 2020/05/07 09:36:36 aggregator: found 0 consul watches 2020/05/07 09:36:36 aggregator: initialized k8s watch: secret|core|metadata.name=ambassador-edge-stack| 2020/05/07 09:36:36 aggregator: initialized k8s watch: service|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:36 aggregator: initialized k8s watch: secret|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:36 aggregator: 
initialized k8s watch: secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc| 2020/05/07 09:36:36 aggregator: initialized k8s watch: secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory| 2020/05/07 09:36:36 aggregator: initialized k8s watch: secret|aaa.bbb.ccc|metadata.name=portal-2dtest| 2020/05/07 09:36:36 aggregator: waiting for k8s watch: TLSContext|core|| 2020/05/07 09:36:36 kubewatchman: processing 7 kubernetes watch specs 2020/05/07 09:36:36 consulwatchman: processing 0 consul watches 2020/05/07 09:36:36 kubernetes:TLSContext|core||: found 0 "TLSContext" in namespace "core" 2020/05/07 09:36:36 kubernetes:TLSContext|core||: sent "TLSContext" to receivers I0507 09:36:36.672026 231 leaderelection.go:350] lock is held by ambassador-67d64b4b6b-jnpg9 and has not yet expired I0507 09:36:36.672045 231 leaderelection.go:246] failed to acquire lease core/kale 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:37 watch-hook INFO: reading edge-stack-mappings.yaml (/ambassador/init-config/edge-stack-mappings.yaml) 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:37 watch-hook INFO: YAML: using C parser 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:37 watch-hook INFO: YAML: using C dumper 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:37 watch-hook INFO: Parsed pod labels: {'pod-template-hash': '67d64b4b6b', 'service': 'ambassador'} 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:37 watch-hook INFO: Found Ambassador service: ambassador 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Host portal-test setting up 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Host portal-test: TLS secret name is portal-2dtest.aaa.bbb.ccc 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Host portal-test: creating TLSContext portal-test-context 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: IRTLSContext setup good: <IRTLSContext portal-test-context.core: hosts ['portal-test.aaa.bbb.ccc'] secret portal-2dtes t.aaa.bbb.ccc> 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Host portal-test: ACME private key name is https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Host setup OK: 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: HostFactory: saving host 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Intercept agent not active, skipping initialization 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: HostFactory: FTC True, host_count 1 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: IR: watching Edge Stack 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020-05-07 09:36:38 watch-hook INFO: Intercept agent not active, skipping finalization 2020/05/07 09:36:38 aggregator: watch hook stderr: 2020/05/07 09:36:38 aggregator: found 7 kubernetes watches 2020/05/07 09:36:38 aggregator: found 0 consul watches 2020/05/07 09:36:38 aggregator: initialized k8s watch: secret|core|metadata.name=ambassador-edge-stack| 2020/05/07 09:36:38 aggregator: initialized k8s watch: service|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:38 aggregator: initialized k8s 
watch: secret|core||hostname=portal-test.aaa.bbb.ccc 2020/05/07 09:36:38 aggregator: initialized k8s watch: secret|core|metadata.name=portal-2dtest.aaa.bbb.ccc| 2020/05/07 09:36:38 aggregator: initialized k8s watch: secret|core|metadata.name=https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory| 2020/05/07 09:36:38 aggregator: initialized k8s watch: secret|aaa.bbb.ccc|metadata.name=portal-2dtest| 2020/05/07 09:36:38 aggregator: initialized k8s watch: TLSContext|core|| 2020/05/07 09:36:38 aggregator: bootstrapped! 2020/05/07 09:36:38 kubewatchman: processing 7 kubernetes watch specs 2020/05/07 09:36:38 consulwatchman: processing 0 consul watches 2020/05/07 09:36:38 notify: python /ambassador/post_update.py --watt http://localhost:8002/snapshots/1 2020-05-07 09:36:39 diagd 1.4.2 [P219TThreadPoolExecutor-0_3] INFO: Update requested: watt, http://localhost:8002/snapshots/1 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: copying configuration: watt, http://localhost:8002/snapshots/1 to /ambassador/snapshots/snapshot-tmp.yaml 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: reading edge-stack-mappings.yaml (/ambassador/init-config/edge-stack-mappings.yaml) 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: YAML: using C parser 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: YAML: using C dumper 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Parsed pod labels: {'pod-template-hash': '67d64b4b6b', 'service': 'ambassador'} 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Found Ambassador service: ambassador 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Host portal-test setting up 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Host portal-test: TLS secret name is portal-2dtest.aaa.bbb.ccc 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Host portal-test: creating TLSContext portal-test-context 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: IRTLSContext setup good: <IRTLSContext portal-test-context.core: hosts ['portal-test.aaa.bbb.ccc'] secret portal-2dtest.aaa.bbb.ccc> 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Host portal-test: ACME private key name is https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] ERROR: Secret https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory.core unknown 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] ERROR: Host portal-test: continuing with invalid private key secret https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Host setup OK: 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: HostFactory: saving host 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Intercept agent not active, skipping initialization 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: HostFactory: FTC True, host_count 1 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: IR: starting Edge Stack 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: Intercept agent not active, skipping finalization 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: ListenerFactory: Host portal-test.aaa.bbb.ccc has TLS active, defaulting additionalPort to 8080 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: ListenerFactory: Host portal-test.aaa.bbb.ccc terminating TLS with context portal-test-context 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: adding mappings for Edge Policy Console 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: IR: cluster cluster_127_0_0_1_8500_core is the sidecar 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: V2Listener finalize {'name': 'ambassador-listener-8443', 'port': 8443, 
'use_proxy_proto': False, 'vhosts': {'portal-test.aaa.bbb.ccc': '<VHost ctx <V2TLSCo ntext chain_file .../portal-2dtest.aaa.bbb.ccc/4279C836...> redir False a Route ia Route 54 routes>'}} 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: V2Listener finalize {'name': 'ambassador-listener-8080', 'port': 8080, 'use_proxy_proto': False, 'vhosts': {'portal-test.aaa.bbb.ccc': '<VHost ctx -none- r edir False a None ia Route 27 routes>'}} 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: -global-: NOTICE: A future Ambassador version will change the GRPC protocol version for AuthServices and RateLimitServices. See the CHANGELOG for details. 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: successfully validated the resulting envoy configuration, continuing... 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: saving Envoy configuration for snapshot 1 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: running 'kill -HUP 1' 2020-05-07 09:36:39 AMBASSADOR INFO launching worker process 'envoy': 'envoy' '-c' '/ambassador/bootstrap-ads.json' '--drain-time-s' '1' 2020-05-07 09:36:39 AMBASSADOR INFO envoy is PID 298 2020-05-07 09:36:39 AMBASSADOR INFO KICK: started Envoy as PID 298 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: configuration updated from snapshot 1 2020-05-07 09:36:39 diagd 1.4.2 [P219TAEW] INFO: starting Envoy status updater KubeStatus UPDATE 300: running command: ['kubestatus', 'Mapping', '-f', 'metadata.name=ambassador-devportal-api', '-n', 'core', '-u', '/dev/fd/0'] [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:249] initializing epoch 0 (hot restart version=11.104) [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:251] statically linked extensions: [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:253] access_loggers: envoy.file_access_log,envoy.http_grpc_access_log,envoy.tcp_grpc_access_log [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:256] filters.http: envoy.buffer,envoy.cors,envoy.csrf,envoy.ext_authz,envoy.fault,envoy.filters.http.adaptive_concurrency,envoy.filters.http .dynamic_forward_proxy,envoy.filters.http.grpc_http1_reverse_bridge,envoy.filters.http.grpc_stats,envoy.filters.http.header_to_metadata,envoy.filters.http.jwt_authn,envoy.filters.http.original_src,envoy.filters .http.rbac,envoy.filters.http.tap,envoy.grpc_http1_bridge,envoy.grpc_json_transcoder,envoy.grpc_web,envoy.gzip,envoy.health_check,envoy.http_dynamo_filter,envoy.ip_tagging,envoy.lua,envoy.rate_limit,envoy.route r,envoy.squash [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:259] filters.listener: envoy.listener.http_inspector,envoy.listener.original_dst,envoy.listener.original_src,envoy.listener.proxy_protocol,e nvoy.listener.tls_inspector [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:262] filters.network: envoy.client_ssl_auth,envoy.echo,envoy.ext_authz,envoy.filters.network.dubbo_proxy,envoy.filters.network.mysql_proxy,e nvoy.filters.network.rbac,envoy.filters.network.sni_cluster,envoy.filters.network.thrift_proxy,envoy.filters.network.zookeeper_proxy,envoy.http_connection_manager,envoy.mongo_proxy,envoy.ratelimit,envoy.redis_p roxy,envoy.tcp_proxy [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:264] stat_sinks: envoy.dog_statsd,envoy.metrics_service,envoy.stat_sinks.hystrix,envoy.statsd [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:266] tracers: 
envoy.dynamic.ot,envoy.lightstep,envoy.tracers.datadog,envoy.tracers.opencensus,envoy.tracers.xray,envoy.zipkin [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:269] transport_sockets.downstream: envoy.transport_sockets.alts,envoy.transport_sockets.raw_buffer,envoy.transport_sockets.tap,envoy.transpo rt_sockets.tls,raw_buffer,tls [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:272] transport_sockets.upstream: envoy.transport_sockets.alts,envoy.transport_sockets.raw_buffer,envoy.transport_sockets.tap,envoy.transport _sockets.tls,raw_buffer,tls [2020-05-07 09:36:39.794][298][info][main] [source/server/server.cc:278] buffer implementation: new [2020-05-07 09:36:39.797][298][info][main] [source/server/server.cc:344] admin address: 127.0.0.1:8001 time="2020-05-07 09:36:39" level=info msg="license_secret_watch: inspecting new snapshot: looking for core/ambassador-edge-stack within 3 secrets" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 file="github.com/datawire/apro@/cmd/amb-sidecar/runner/main.go:255" MAIN=license_secret_watch time="2020-05-07 09:36:39" level=info msg="license_secret_watch: ignoring secret core/https-3a-2f-2facme-2dv02.api.letsencrypt.org-2fdirectory" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 fi le="github.com/datawire/apro@/cmd/amb-sidecar/runner/main.go:282" MAIN=license_secret_watch time="2020-05-07 09:36:39" level=info msg="license_secret_watch: AES secret found (core/ambassador-edge-stack): getting license data" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 file="github .com/datawire/apro@/cmd/amb-sidecar/runner/main.go:265" MAIN=license_secret_watch time="2020-05-07 09:36:39" level=warning msg="license_secret_watch: empty decoded license data" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 file="github.com/datawire/apro@/cmd/amb-sidecar/ru nner/main.go:274" MAIN=license_secret_watch time="2020-05-07 09:36:39" level=info msg="license_secret_watch: ignoring secret core/portal-2dtest.aaa.bbb.ccc" func=github.com/datawire/apro/cmd/amb-sidecar/runner.runE.func4 file="github.com/datawire/apr o@/cmd/amb-sidecar/runner/main.go:282" MAIN=license_secret_watch [2020-05-07 09:36:39.798][298][info][main] [source/server/server.cc:458] runtime: layers:

attilajanko commented 4 years ago

I tried updating the AKS version to 1.16.7, but the problem still exists.

```
ambassador-67d64b4b6b-ptfrh   0/1   CrashLoopBackOff   7   11m
```

...

```
2020-05-08 11:13:00 diagd 1.4.2 [P219TThreadPoolExecutor-0_1] INFO: C6416A76-81B4-4163-8304-D1ADB05522BF: 127.0.0.1 "GET /ambassador/v0/diag/" START
2020-05-08 11:13:00 diagd 1.4.2 [P219TThreadPoolExecutor-0_1] ERROR: 'NoneType' object has no attribute 'overview'
Traceback (most recent call last):
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 233, in wrapper
    result = f(*args, reqid=reqid, **kwds)
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 519, in show_overview
    ov = diag.overview(request, app.estats)
AttributeError: 'NoneType' object has no attribute 'overview'
2020-05-08 11:13:00 diagd 1.4.2 [P219TThreadPoolExecutor-0_1] ERROR: C6416A76-81B4-4163-8304-D1ADB05522BF: 127.0.0.1 "GET /ambassador/v0/diag/" 2ms 500 server error
```

After 12 restarts and almost half an hour it is in the Running state and working:

```
ambassador-67d64b4b6b-ptfrh   1/1   Running   12   28m
```

```
ACCESS [2020-05-08T11:18:53.326Z] "GET /a/b/c/d/e HTTP/1.1" 200 - 0 4161 8 5 "f.g.h.i" "-" "05f09cff-31ac-466e-ba20-baea4833d1a8" "j.k.l.m" "n.o.p.q:80"
```
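For anyone hitting the same behaviour, one way to confirm why the container kept getting killed during those 12 restarts (pod name taken from the listing above; replace `<namespace>` with wherever AES runs) might be:

```
# pod events usually show whether the liveness or readiness probe failed
kubectl -n <namespace> describe pod ambassador-67d64b4b6b-ptfrh

# logs of the previous, killed container instance
kubectl -n <namespace> logs ambassador-67d64b4b6b-ptfrh --previous
```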

attilajanko commented 4 years ago

The issue still exists in version 1.5.0.

attilajanko commented 4 years ago

The issue still exists in version 1.5.2.

grrywlsn commented 4 years ago

I'm also seeing this issue in 1.5.2

grrywlsn commented 4 years ago

@attilajanko do you have many Mapping objects? It seems that if I reduce them to a small number, Ambassador runs successfully.
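A quick way to answer that (a sketch; assumes the Mapping CRD from AES is installed) could be:

```
# count Mapping resources across all namespaces
kubectl get mappings --all-namespaces --no-headers | wc -l
```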

attilajanko commented 4 years ago

Hi @grrywlsn, as a last resort I deployed a new cluster in Azure straight from the Marketplace (no Terraform, no ARM template, just click-through creation) with version 1.15.11. Then I followed the manual steps:

```
kubectl apply -f https://www.getambassador.io/yaml/aes-crds.yaml && \
kubectl wait --for condition=established --timeout=90s crd -lproduct=aes && \
kubectl apply -f https://www.getambassador.io/yaml/aes.yaml && \
kubectl -n ambassador wait --for condition=available --timeout=90s deploy -lproduct=aes
```

The log still contains the original error, but it now starts without a lot of restarts:

```
time="2020-06-12 08:10:57" level=warning msg="statsd is not in use"
2020-06-12 08:10:57 diagd 1.3.2 [P111TThreadPoolExecutor-0_0] INFO: 28A91046-46A8-4238-BEE8-8D0D9A05FA83: 127.0.0.1 "GET /ambassador/v0/diag/" START
2020-06-12 08:10:57 diagd 1.3.2 [P111TThreadPoolExecutor-0_0] ERROR: 'NoneType' object has no attribute 'overview'
Traceback (most recent call last):
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 233, in wrapper
    result = f(*args, reqid=reqid, **kwds)
  File "/usr/lib/python3.7/site-packages/ambassador-0.0.0.dev0-py3.7.egg/ambassador_diag/diagd.py", line 519, in show_overview
    ov = diag.overview(request, app.estats)
AttributeError: 'NoneType' object has no attribute 'overview'
2020-06-12 08:10:57 diagd 1.3.2 [P111TThreadPoolExecutor-0_0] ERROR: 28A91046-46A8-4238-BEE8-8D0D9A05FA83: 127.0.0.1 "GET /ambassador/v0/diag/" 2ms 500 server error
time="2020-06-12 08:10:57" level=error msg="Bad HTTP response" status_code=500 subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true"
time="2020-06-12 08:10:57" level=error msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true" subsystem=fetcher url="http://127.0.0.1:8877/ambassador/v0/diag/?json=true"
time="2020-06-12 08:10:57" level=info msg="HTTP error 500 from http://127.0.0.1:8877/ambassador/v0/diag/?json=true"
```

```
2020-06-12 08:11:57 diagd 1.3.2 [P111TThreadPoolExecutor-0_3] INFO: 91583D1E-CBD1-4BB6-883F-0AF2D94E16F8: 127.0.0.1 "GET /ambassador/v0/diag/" START
2020-06-12 08:11:57 diagd 1.3.2 [P111TThreadPoolExecutor-0_3] INFO: DEBUG_MODE False status_dict {'Error check': {'status': True, 'specifics': [(True, 'No errors logged')]}, 'TLS': {'status': True, 'specifics': [(True, '1 TLSContext is active')]}, 'Mappings': {'status': True, 'specifics': [(True, '5 Mappings are active')]}}
```

Regards Attila

attilajanko commented 4 years ago

And in case you are wondering about the Mappings: at the moment only these exist in the new cluster:

```
$ kubectl get mappings --all-namespaces
NAMESPACE    NAME                       PREFIX      SERVICE          STATE     REASON
ambassador   ambassador-devportal       /docs/      127.0.0.1:8500   Running
ambassador   ambassador-devportal-api   /openapi/   127.0.0.1:8500   Running
```

But of course we have a lot of Mappings for our services in our existing configuration, for example:

```yaml
apiVersion: ambassador/v2
kind: Mapping
name: abcd_mapping
prefix: /api/v1/abcd
rewrite: /api/v1/abcd
service: efgh
timeout_ms: 60000
```

attilajanko commented 4 years ago

Following this: https://github.com/datawire/ambassador/issues/1784

I had the same issue during startup:

```
$ curl 127.0.0.1:8877/ambassador/v0/check_alive
ambassador liveness check OK (6 seconds)
$ curl 127.0.0.1:8877/ambassador/v0/check_ready
ambassador waiting for config
```
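If I read the default manifest correctly, these two endpoints are what the stock AES Deployment wires into the pod's liveness and readiness probes, so the currently configured probes can be inspected like this (a sketch; the `ambassador` Deployment in the `ambassador` namespace is assumed from the quick-start install):

```
# show the liveness and readiness probes of the ambassador container
kubectl -n ambassador get deploy ambassador -o jsonpath='{.spec.template.spec.containers[0].livenessProbe}'
kubectl -n ambassador get deploy ambassador -o jsonpath='{.spec.template.spec.containers[0].readinessProbe}'
```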

attilajanko commented 4 years ago

It seems this configuration for the liveness and readiness probes was helpful for us:

```yaml
initialDelaySeconds: 90
periodSeconds: 60
timeoutSeconds: 15
failureThreshold: 10
successThreshold: 1
```
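For context, these values would sit on the probes of the ambassador container roughly like this (a sketch only; the port and paths are taken from the curl checks above, not from our actual manifest):

```yaml
# relaxed probe timing, so slow config loading does not get the container killed and restarted
livenessProbe:
  httpGet:
    path: /ambassador/v0/check_alive
    port: 8877
  initialDelaySeconds: 90
  periodSeconds: 60
  timeoutSeconds: 15
  failureThreshold: 10
  successThreshold: 1
readinessProbe:
  httpGet:
    path: /ambassador/v0/check_ready
    port: 8877
  initialDelaySeconds: 90
  periodSeconds: 60
  timeoutSeconds: 15
  failureThreshold: 10
  successThreshold: 1
```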

With these settings the 500 error still appears, but there are no restarts and the pods start in about 5 minutes:

```
ambassador-7788d44cd7-7ndt2   1/1   Running   0   5m16s
ambassador-86dbc79c74-64bmk   1/1   Running   0   4m2s
```