jcristol opened this issue 3 years ago
Same here, although I created a dummy certificate within Azure Key Vault via Generate/Import. I increased the log level to "Trace"; see below:
{"ts":1616649528797.6128,"msg":"started workers","v":0}
{"ts":1616649528797.7607,"msg":"processing azurekeyvaultsecret","v":4,"key":"vnext-backend/svrs-azr-dev-vital-test-cert"}
{"ts":1616649528797.7974,"msg":"get or create secret","v":4,"secret":"vnext-backend/nginx-cert"}
{"ts":1616649529050.7837,"msg":"values have changed requiring update to secret","v":0,"azurekeyvaultsecret":"vnext-backend/svrs-azr-dev-vital-test-cert","secret":"vnext-backend/nginx-cert"}
{"ts":1616649529051.0198,"msg":"Observed a panic: \"invalid memory address or nil pointer dereference\" (runtime error: invalid memory address or nil pointer dereference)\ngoroutine 83 [running]:\nk8s.io/apimachinery/pkg/util/runtime.logPanic(0x184ca20, 0x2713ba0)\n\t/go
/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:74 +0x95\nk8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)\n\t/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:48 +0x89\npanic(0x184ca20, 0x2713ba0)\n\t/usr/local/go/src/ru
ntime/panic.go:969 +0x1b9\ngithub.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.createNewSecretFromExisting(0xc000435180, 0xc0002985a0, 0xc0006bec80, 0x0, 0x0, 0x1a87cc2)\n\t/go/src/github.com/SparebankenVest/azure-key-vault-t
o-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:227 +0x6ef\ngithub.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc0004269c0, 0xc000435180, 0x2a, 0xc000435180, 0x0)\n\
t/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:162 +0x7a9\ngithub.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc000
4269c0, 0xc0000a37d0, 0x2a, 0xc00046f4e0, 0x7277575479596a00)\n\t/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b\nkmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc
0002cca00, 0x203000)\n\t/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec\nkmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc0002cca00)\n\t/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/
tools/queue/worker.go:67 +0x2b\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0003f6030)\n\t/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003f6030, 0x1c4d8a0, 0xc0002984b0, 0x6a684465
76386901, 0xc000082120)\n\t/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad\nk8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0003f6030, 0x3b9aca00, 0x0, 0x6b647a4c31426e01, 0xc000082120)\n\t/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wai
t.go:133 +0x98\nk8s.io/apimachinery/pkg/util/wait.Until(0xc0003f6030, 0x3b9aca00, 0xc000082120)\n\t/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d\ncreated by kmodules.xyz/client-go/tools/queue.(*Worker).Run\n\t/go/pkg/mod/kmodules.xyz/client-go@v0
.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89\n","v":0}
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0x16d9a6f]
goroutine 83 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:55 +0x10c
panic(0x184ca20, 0x2713ba0)
/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.createNewSecretFromExisting(0xc000435180, 0xc0002985a0, 0xc0006bec80, 0x0, 0x0, 0x1a87cc2)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:227 +0x6ef
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc0004269c0, 0xc000435180, 0x2a, 0xc000435180, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:162 +0x7a9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc0004269c0, 0xc0000a37d0, 0x2a, 0xc00046f4e0, 0x7277575479596a00)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b
kmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc0002cca00, 0x203000)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec
kmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc0002cca00)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:67 +0x2b
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0003f6030)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003f6030, 0x1c4d8a0, 0xc0002984b0, 0x6a68446576386901, 0xc000082120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0003f6030, 0x3b9aca00, 0x0, 0x6b647a4c31426e01, 0xc000082120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc0003f6030, 0x3b9aca00, 0xc000082120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d
created by kmodules.xyz/client-go/tools/queue.(*Worker).Run
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89
Hello, I'm not able to reproduce the issue myself. Could you guys (@jonescobedo, @jcristol) provide some more information?
Are there any special requirements for the certificate?
What version of the akv2k8s-controller are you using?
Does the certificate exist in the Key Vault before creating the AzureKeyVaultSecret resource in the cluster?
Hello, I'm not able to reproduce the issue myself. Could you guys (@jonescobedo, @jcristol) provide some more information?
I followed this guide: https://blog.baeke.info/2020/12/07/certificates-with-azure-key-vault-and-nginx-ingress-controller/
Are there any special requirements for the certificate?
The certificate is self-signed and is created in k8s as a secret object in a different namespace (nginx). The akv2k8s chart has its own namespace as well.
[
  {
    "SecretName": "nginx-cert",
    "Namespace": "nginx",
    "Version": 3,
    "SerialNumber": "67b2dcf8ebf24224ae99d7c6035319df",
    "Issuer": "CN=api.vitalnextdev.net",
    "Validity": {
      "NotBefore": "2021-03-23T04:57:28Z",
      "NotAfter": "2022-03-23T05:07:28Z"
    },
    "Subject": "CN=api.vitalnextdev.net",
    "IsCA": false
  }
]
You can recreate the certificate with the command below; the certificate.pem and private-key.pem files are also available here for download.
cert_ns=nginx
kubectl -n $cert_ns create secret tls nginx-cert --key="private-key.pem" --cert="certificate.pem"
What version of the akv2k8s-controller are you using?
I have tried 1.1 and 1.2 (spv.no/v1 and spv.no/v2beta1)
Does the certificate exist in the Key Vault before creating the AzureKeyVaultSecret resource in the cluster?
Yes. I also uninstalled the chart, created a "secret" placeholder with the expected output certificate name in the cluster, and installed the Helm chart again, without luck.
Thanks for looking into it!
Nice tutorial btw :smile:
Could you try the Helm chart version 2.0.9 and akv2k8s-controller 1.2.2 (default in chart 2.0.9)?
I just did that; unfortunately, the deployment/pod is still crashing with the same results.
E0325 08:57:35.182563 1 runtime.go:78] Observed a panic: "invalid memory address or nil pointer dereference" (runtime error: invalid memory address or nil pointer dereference)
goroutine 72 [running]:
k8s.io/apimachinery/pkg/util/runtime.logPanic(0x184ca20, 0x2713ba0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:74 +0x95
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:48 +0x89
panic(0x184ca20, 0x2713ba0)
/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.createNewSecretFromExisting(0xc00013c000, 0xc000110bd0, 0xc0004f5cc0, 0x0, 0x0, 0x1a87cce)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:227 +0x6ef
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc0000fca90, 0xc00013c000, 0x2a, 0xc00013c000, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:162 +0x7a9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc0000fca90, 0xc000510fc0, 0x2a, 0xc00050b220, 0x455400)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b
kmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc00009e580, 0x203000)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec
kmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc00009e580)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:67 +0x2b
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0003ca680)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003ca680, 0x1c4d980, 0xc000110b10, 0x1, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0003ca680, 0x3b9aca00, 0x0, 0xc000292901, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc0003ca680, 0x3b9aca00, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d
created by kmodules.xyz/client-go/tools/queue.(*Worker).Run
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x20 pc=0x16d9eaf]
goroutine 72 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:55 +0x10c
panic(0x184ca20, 0x2713ba0)
/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.createNewSecretFromExisting(0xc00013c000, 0xc000110bd0, 0xc0004f5cc0, 0x0, 0x0, 0x1a87cce)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:227 +0x6ef
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc0000fca90, 0xc00013c000, 0x2a, 0xc00013c000, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:162 +0x7a9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc0000fca90, 0xc000510fc0, 0x2a, 0xc00050b220, 0x455400)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b
kmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc00009e580, 0x203000)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec
kmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc00009e580)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:67 +0x2b
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc0003ca680)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc0003ca680, 0x1c4d980, 0xc000110b10, 0x1, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc0003ca680, 0x3b9aca00, 0x0, 0xc000292901, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc0003ca680, 0x3b9aca00, 0xc00007a120)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d
created by kmodules.xyz/client-go/tools/queue.(*Worker).Run
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89
Ok, I think I know what's happening...
When akv2k8s syncs a secret from Key Vault to a Kubernetes secret, it adds owner references to the secret, to keep track of it and be able to create/update/delete it depending on what happens with the AzureKeyVaultSecret resource.
If you create the secret manually, it does not have owner references pointing to the akv2k8s controller, and therefore the akv2k8s controller cannot update it.
If you delete the manually created Kubernetes secret, the akv2k8s controller will recreate it itself, with the correct owner references, on the next sync interval.
Just tested now... the akv2k8s controller does not recover; a restart is needed.
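To illustrate the ownership mechanism described above, here is a minimal, hedged Go sketch (not the project's actual code; the helper name and GroupVersionKind details are illustrative) of how a controller typically stamps an owner reference onto a Secret it creates, which is what lets it find and safely update that Secret on later syncs:

package controller

import (
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime/schema"
)

// buildOwnedSecret (hypothetical helper) returns a Secret whose ownerReferences
// point back at the owning AzureKeyVaultSecret resource. A Secret created manually
// with kubectl lacks this reference, so the controller does not "own" it and
// cannot safely update or garbage-collect it.
func buildOwnedSecret(owner metav1.Object, namespace, name string, data map[string][]byte) *corev1.Secret {
	// GroupVersionKind shown for illustration; the real values come from the akv2k8s CRD.
	gvk := schema.GroupVersionKind{Group: "spv.no", Version: "v2beta1", Kind: "AzureKeyVaultSecret"}
	return &corev1.Secret{
		ObjectMeta: metav1.ObjectMeta{
			Name:            name,
			Namespace:       namespace,
			OwnerReferences: []metav1.OwnerReference{*metav1.NewControllerRef(owner, gvk)},
		},
		Data: data,
	}
}

A Secret created by hand with kubectl carries no such reference, which is why the controller treats it differently from one it created itself.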
If you create the secret manually, it does not have owner references pointing to the akv2k8s controller, and therefore the akv2k8s controller cannot update it.
This was the trick: I deleted the old AzureKeyVaultSecret via kubectl delete -f and also deleted the secret (in that order), and it is now working!
SECRET NAME SYNCHED
nginx-cert 2021-03-25T09:39:19Z
Thanks much!
Released in controller 1.2.3, chart version 2.0.10 :blush: Thanks for the help @jonescobedo!
Appreciate the quick debug. Unfortunately, following @jonescobedo's advice and updating to the latest controller image didn't work for me.
This was the trick: I deleted the old AzureKeyVaultSecret via kubectl delete -f and also deleted the secret (in that order), and it is now working!
Similar crash output as before when setting the log level to "6", which I'm guessing is trace level:
k8s.io/apimachinery/pkg/util/runtime.logPanic(0x184da20, 0x2715ba0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:74 +0x95
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:48 +0x89
panic(0x184da20, 0x2715ba0)
/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/pkg/azure/keyvault/client.(*azureKeyVaultService).GetCertificate(0xc00006ef60, 0xc0004b2118, 0xc0000481e8, 0x0, 0x0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/pkg/azure/keyvault/client/service.go:125 +0x2fa
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*azureCertificateHandler).HandleSecret(0xc00000c5c0, 0xc00000c5c0, 0x0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret_handler.go:183 +0xe2
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getSecretFromKeyVault(0xc00035eea0, 0xc0004b2000, 0x1a55143, 0x8, 0x1c4ea00)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:381 +0x190
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc00035eea0, 0xc0004b2000, 0x22, 0xc0004b2000, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:116 +0xca7
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc00035eea0, 0xc0005280c0, 0x22, 0xc00052a0b0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b
kmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc0004aabc0, 0x203000)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec
kmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc0004aabc0)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:67 +0x2b
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00006f370)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00006f370, 0x1c4eb80, 0xc00049a510, 0x1, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00006f370, 0x3b9aca00, 0x0, 0x1, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc00006f370, 0x3b9aca00, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d
created by kmodules.xyz/client-go/tools/queue.(*Worker).Run
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89
panic: runtime error: invalid memory address or nil pointer dereference [recovered]
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x10 pc=0x16bb73a]
goroutine 48 [running]:
k8s.io/apimachinery/pkg/util/runtime.HandleCrash(0x0, 0x0, 0x0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/runtime/runtime.go:55 +0x10c
panic(0x184da20, 0x2715ba0)
/usr/local/go/src/runtime/panic.go:969 +0x1b9
github.com/SparebankenVest/azure-key-vault-to-kubernetes/pkg/azure/keyvault/client.(*azureKeyVaultService).GetCertificate(0xc00006ef60, 0xc0004b2118, 0xc0000481e8, 0x0, 0x0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/pkg/azure/keyvault/client/service.go:125 +0x2fa
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*azureCertificateHandler).HandleSecret(0xc00000c5c0, 0xc00000c5c0, 0x0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret_handler.go:183 +0xe2
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getSecretFromKeyVault(0xc00035eea0, 0xc0004b2000, 0x1a55143, 0x8, 0x1c4ea00)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:381 +0x190
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).getOrCreateKubernetesSecret(0xc00035eea0, 0xc0004b2000, 0x22, 0xc0004b2000, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/secret.go:116 +0xca7
github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller.(*Controller).syncAzureKeyVaultSecret(0xc00035eea0, 0xc0005280c0, 0x22, 0xc00052a0b0, 0x0)
/go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/cmd/azure-keyvault-controller/controller/azureKeyVaultSecret.go:168 +0x67b
kmodules.xyz/client-go/tools/queue.(*Worker).processNextEntry(0xc0004aabc0, 0x203000)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:84 +0xec
kmodules.xyz/client-go/tools/queue.(*Worker).processQueue(0xc0004aabc0)
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:67 +0x2b
k8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1(0xc00006f370)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:155 +0x5f
k8s.io/apimachinery/pkg/util/wait.BackoffUntil(0xc00006f370, 0x1c4eb80, 0xc00049a510, 0x1, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:156 +0xad
k8s.io/apimachinery/pkg/util/wait.JitterUntil(0xc00006f370, 0x3b9aca00, 0x0, 0x1, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:133 +0x98
k8s.io/apimachinery/pkg/util/wait.Until(0xc00006f370, 0x3b9aca00, 0xc0004d80c0)
/go/pkg/mod/k8s.io/apimachinery@v0.20.2/pkg/util/wait/wait.go:90 +0x4d
created by kmodules.xyz/client-go/tools/queue.(*Worker).Run
/go/pkg/mod/kmodules.xyz/client-go@v0.0.0-20200521013203-6fe0a448d053/tools/queue/worker.go:53 +0x89
@jcristol Does the certificate in the Azure Key Vault have any additional configuration, such as disabling the Exportable Private Key option?
@jcristol Can you provide some more details about the certificate in the Key Vault and the AzureKeyVaultSecret resource?
I'm pretty confident we have the ability to export private keys from the certificate, and this certificate was created with a standard set of settings.
- Wildcard cert for *.bhceng.co
- Issuer: C=US, O=DigiCert Inc, CN=DigiCert TLS RSA SHA256 2020 CA1
- Subject: C=US, ST=Minnesota, L=Minneapolis, O=Bright Health Inc., CN=*.bhceng.co
- SAN: *.bhceng.co, bhceng.co, www.bhceng.co
Here is the output from az keyvault certificate show --id for the cert, with all of the sensitive stuff scrubbed:
{
  "attributes": {
    "created": "2021-01-11T22:03:15+00:00",
    "enabled": true,
    "expires": "2022-01-06T23:59:59+00:00",
    "notBefore": "2021-01-11T00:00:00+00:00",
    "recoveryLevel": "Recoverable",
    "updated": "2021-01-11T22:03:15+00:00"
  },
  "cer": "<hidden>",
  "contentType": null,
  "id": "https://<hidden>.vault.azure.net/certificates/wildcard-bhceng-co/<hidden>",
  "kid": "https://<hidden>.vault.azure.net/keys/wildcard-bhceng-co/<hidden>",
  "name": "wildcard-bhceng-co",
  "policy": null,
  "sid": "https://<hidden>.vault.azure.net/secrets/wildcard-bhceng-co/<hidden>",
  "tags": null,
  "x509Thumbprint": "<hidden>",
  "x509ThumbprintHex": "<hidden>"
}
@jcristol I'm sorry for the late reply. Did you figure out any more details? Could you provide the AzureKeyVaultSecret manifest as well, so I can try to reproduce a similar scenario?
@181192 I personally get this error whenever I try to reference a certificate from Key Vault with a specific version. Maybe I'm doing it wrong, but I haven't found any example.
Or it might be due to multiple akvs controllers and some sort of collision? There's one akvs controller outside of my namespace, which listens on all namespaces and does not have labels. Then there is a second controller in my namespace, listening on that namespace only and managing akvs with specific labels. Might be related to that.
@181192 In our use case, and in @jcristol's logs here, the error happens at /go/src/github.com/SparebankenVest/azure-key-vault-to-kubernetes/pkg/azure/keyvault/client/service.go:125 +0x2fa
At line 125 there is the statement if !*certBundle.Policy.KeyProperties.Exportable {
The nil pointer dereference is on the Policy object. For some reason, if the AKV object version is specified, Policy is null. I verified it via the Azure CLI, so it's probably an AKV "feature".
Quote from Azure Key Vault docs
Once a policy has been established, it isn't required with successive create operations for future versions. There's only one instance of a policy for all the versions of a Key Vault certificate.
The potential fix would be to first get the certBundle without a version, extract the Policy, and then use the certBundle with the version, or simply not check the Exportable field at all and let it fail elsewhere.
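To make the proposed guard concrete, here is a hedged Go sketch (using simplified stand-in types rather than the actual Azure SDK or akv2k8s code; keyIsExportable is a hypothetical helper) of the kind of nil check that would avoid the panic at service.go:125 when a version-specific certificate bundle comes back without a policy:

package keyvaultclient

// Simplified stand-ins mirroring the shape of a Key Vault certificate bundle;
// the real types come from the Azure SDK used by akv2k8s.
type KeyProperties struct{ Exportable *bool }
type CertificatePolicy struct{ KeyProperties *KeyProperties }
type CertificateBundle struct{ Policy *CertificatePolicy }

// keyIsExportable checks every pointer in the chain before dereferencing.
// When a certificate is fetched by an explicit version, Key Vault can return
// the bundle without a policy (the policy exists only once per certificate),
// so the unguarded *certBundle.Policy.KeyProperties.Exportable panics.
func keyIsExportable(bundle CertificateBundle) bool {
	if bundle.Policy == nil ||
		bundle.Policy.KeyProperties == nil ||
		bundle.Policy.KeyProperties.Exportable == nil {
		// Either assume exportable here, or refetch the bundle without a
		// version just to read the policy, as suggested above.
		return true
	}
	return *bundle.Policy.KeyProperties.Exportable
}

Whether to default to exportable or to refetch the policy without a version is a design choice for the maintainers; the point is simply that Policy can legitimately be nil on a version-specific lookup.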
Note: Make sure to check out known issues (https://akv2k8s.io/troubleshooting/known-issues/) before submitting.

Components and versions
Select which component(s) the bug relates to with [X].
[X] Controller, version: x.x.x (docker image tag)
[ ] Env-Injector (webhook), version: x.x.x (docker image tag)
[ ] Other

Describe the bug
When syncing a certificate following this tutorial, the controller crashes while trying to get the certificate from Azure.

To Reproduce
Steps to reproduce the behavior: try syncing a PEM certificate signed by DigiCert from an Azure Key Vault.

Expected behavior
I would expect the controller not to crash.

Logs
Crash output

Additional context
We noticed this happening on certificates that were previously being synced. The last sync date for the certificate causing the crash was 2021-03-02. We haven't updated the certificate in the Key Vault, nor have we touched our Azure Key Vault controller config, so I suspect it's a bug or a change to Azure's API.