bitnami-labs / sealed-secrets

A Kubernetes controller and tool for one-way encrypted Secrets
Apache License 2.0

Broken SealedSecret causes failure in processing #505

Open bbockelm opened 3 years ago

bbockelm commented 3 years ago

Someone on our team accidentally uploaded a broken sealed secret to the cluster (it's not entirely obvious what was wrong with it). It appears that once the sealed-secrets operator considered this secret, the operator failed and stopped processing all other items.

Here's the error message that was printed about once a second:

E0121 18:14:24.255540       1 reflector.go:123] github.com/bitnami-labs/sealed-secrets/cmd/controller/controller.go:164: \
Failed to list *v1alpha1.SealedSecret: v1alpha1.SealedSecretList.Items: \
[]v1alpha1.SealedSecret: v1alpha1.SealedSecret.Spec: v1alpha1.SealedSecretSpec.EncryptedData: \
ReadString: expects " or n, but found {, error found in #10 byte of \
  ...|etadata":{"name":"ju|..., bigger context ...|ec":{"encryptedData":{"kind":"Secret","metadata":{"name":"jupyter-hub-cookies"},"proxy.token":"AgBa1|...

(Line breaks added for readability.) Note that the original YAML version of the object looked fine, but something in the JSON representation apparently went awry.

Once this particular object was fixed, the operator continued on and processed everything else successfully.
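
For illustration, here is a minimal, self-contained Go sketch (not the controller's actual code; the item type below is a simplified stand-in for the real v1alpha1 types) of the decoding behaviour the error above points to: when a typed list is unmarshalled and a single item carries a nested object where a string is expected under encryptedData, decoding of the whole list fails, so none of the items become available for processing.

package main

import (
	"encoding/json"
	"fmt"
)

// Simplified stand-in for the SealedSecret item type; the real v1alpha1
// types live in the sealed-secrets repository.
type item struct {
	EncryptedData map[string]string `json:"encryptedData"`
}

func main() {
	// One well-formed item followed by one with an object nested under
	// encryptedData, mirroring the shape seen in the error message above.
	list := []byte(`[
		{"encryptedData": {"proxy.token": "AgBa1..."}},
		{"encryptedData": {"kind": "Secret", "metadata": {"name": "jupyter-hub-cookies"}}}
	]`)

	var items []item
	if err := json.Unmarshal(list, &items); err != nil {
		// The error aborts decoding of the entire list, not just the bad item.
		fmt.Println("list decode failed:", err)
	}
}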

bbockelm commented 3 years ago

It seems to be reproducible if you feed it any garbage data in the encryptedData field. From a separate test:

spec:
  encryptedData:
    client_id: xxx
    client_secret: xxx
    metadata:
      name: test-object
      namespace: default

(literally use xxx as the encrypted data)

This is fairly frustrating, as a failure in any one namespace prevents the operator from working in any namespace.
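
As a short, hedged illustration of why the xxx placeholders alone can be enough to break things: if the controller decodes encryptedData values into byte slices, each value must be valid base64, and a literal xxx is not, so the item cannot even be decoded. (In versions where the values are decoded as plain strings, the failure would instead surface at decryption time, as described in later comments.) This is a standalone sketch, not the controller's code:

package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Assuming encryptedData decodes into a map of byte slices, each JSON
	// string value must be valid base64; a literal "xxx" is not, so the
	// whole item fails to decode.
	var data map[string][]byte
	err := json.Unmarshal([]byte(`{"client_id": "xxx", "client_secret": "xxx"}`), &data)
	fmt.Println(err)
}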

smercier74 commented 3 years ago

We have the same issue

windmark commented 3 years ago

I just experienced the same issue, with a single failed SealedSecret blocking all other secrets. Can this be mitigated somehow?

stefancoetzee-xneelo commented 2 years ago

Same here. Feeding it clear text rather than encrypted data causes the controller to get stuck trying to decrypt the same sealed secret over and over, rather than marking it as failed and moving on to the rest of the sealed secrets.

burkhat commented 2 years ago

We ran into the same issue last week. Any updates here?

denniskaulbars commented 2 years ago

Same here. We need a fix for this asap :-/

github-actions[bot] commented 2 years ago

This Issue has been automatically marked as "stale" because it has not had recent activity (for 15 days). It will be closed if no further activity occurs. Thanks for the feedback.

bbockelm commented 2 years ago

@alvneiayu - I see you handled the latest release for 0.17.3. Do you happen to know if there's any interest in fixing this issue?

If not, that's fine - I know what to look for in the logs, and hopefully others can now find this via Google too - and I can just close out the issue for now.

Many thanks!

alvneiayu commented 2 years ago

hi @bbockelm

Let me talk about this with the other maintainers and we will get back to you as soon as possible. In any case, we are open to receiving PRs and would be really happy to review one.

Thanks for your patience and your time.

Álvaro

audoh-tickitto commented 2 years ago

It seems a single bad key causes the entire thing to fail silently (unlike the OP's case, there isn't even an error log about a parsing failure). So until you realise, you'll have outdated secrets, and there doesn't seem to be a good way to narrow down which keys are erroneous besides putting a "TEST" key at the end of the file and deleting keys until it appears.
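
As a rough local aid for narrowing this down (not part of sealed-secrets, and the 100-byte threshold below is only a guess), one could pipe the output of kubectl get sealedsecret <name> -o json into a small checker that flags encryptedData values that clearly cannot be sealed payloads: values that are not plain strings, are not valid base64, or are far too short.

// Rough heuristic checker, not part of sealed-secrets. Reads a SealedSecret
// in JSON form from stdin and flags suspicious encryptedData values.
package main

import (
	"encoding/base64"
	"encoding/json"
	"fmt"
	"io"
	"os"
)

type sealedSecret struct {
	Spec struct {
		// json.RawMessage keeps values that cannot even be parsed as strings.
		EncryptedData map[string]json.RawMessage `json:"encryptedData"`
	} `json:"spec"`
}

func main() {
	raw, err := io.ReadAll(os.Stdin)
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}

	var ss sealedSecret
	if err := json.Unmarshal(raw, &ss); err != nil {
		fmt.Fprintln(os.Stderr, "parse:", err)
		os.Exit(1)
	}

	for key, val := range ss.Spec.EncryptedData {
		var s string
		if err := json.Unmarshal(val, &s); err != nil {
			fmt.Printf("%s: SUSPECT (value is not a plain string)\n", key)
			continue
		}
		decoded, err := base64.StdEncoding.DecodeString(s)
		switch {
		case err != nil:
			fmt.Printf("%s: SUSPECT (not valid base64: %v)\n", key, err)
		case len(decoded) < 100: // sealed payloads are much larger; threshold is a guess
			fmt.Printf("%s: SUSPECT (only %d bytes after decoding)\n", key, len(decoded))
		default:
			fmt.Printf("%s: looks plausible\n", key)
		}
	}
}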