1Password / connect-helm-charts

Official 1Password Helm Charts
https://developer.1password.com
MIT License

Error 500 when trying to get a secret #124

Open ankycooper opened 1 year ago

ankycooper commented 1 year ago

Your environment

Chart Version: connect-1.9.0 | APP 1.5.7

Helm Version: v3.10.2

Kubernetes Version: v1.25.4-rc4+k3s1

What happened?

I followed the blog post https://blog.bennycornelissen.nl/post/onepassword-on-kubernetes/; however, I don't get the Kubernetes secret.

What did you expect to happen?

Get a Kubernetes secret for the 1Password item.

Steps to reproduce

1. Install via Helm:

```sh
# Put our 1Password Connect access token in a variable
OP_TOKEN="< paste your token here >"

# Use Helm to install and configure everything. It will use the
# credential file and the ${OP_TOKEN} variable to use the integration
# we set up earlier.
helm upgrade --install connect 1password/connect \
  --set-file connect.credentials=1password-credentials.json \
  --set operator.create=true \
  --set operator.token.value="${OP_TOKEN}" \
  --set "operator.watchNamespace={opconnect,default}" \
  --namespace opconnect
```

2. Apply the following YAML manifest after the 1Password pods have started:

```yaml
apiVersion: onepassword.com/v1
kind: OnePasswordItem
metadata:
  name: password
spec:
  itemPath: "vaults/systems/items/dummy"
```
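
My understanding is that the operator should create a Kubernetes Secret named after the OnePasswordItem once the item is retrieved, so a minimal check (namespaces and names taken from the steps above) would be something like:

```sh
# Confirm the Connect and operator pods are running
kubectl -n opconnect get pods

# The operator should create a Secret named after the OnePasswordItem ("password")
# in the namespace where the manifest was applied (here: default)
kubectl -n default get secret password -o yaml
```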

## Notes & Logs
```sh
kubectl logs onepassworditem.onepassword.com/password
error: no kind "OnePasswordItem" is registered for version "onepassword.com/v1" in scheme "pkg/scheme/scheme.go:28"
```
The following shows a 408 error:
```sh
kubectl get onepassworditem.onepassword.com/password -o yaml
```
```yaml
apiVersion: onepassword.com/v1
kind: OnePasswordItem
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"onepassword.com/v1","kind":"OnePasswordItem","metadata":{"annotations":{},"name":"password","namespace":"default"},"spec":{"itemPath":"vaults/systems/items/dummy"}}
  creationTimestamp: "2022-11-22T02:35:26Z"
  finalizers:
  - onepassword.com/finalizer.secret
  generation: 1
  name: password
  namespace: default
  resourceVersion: "185126"
  uid: bf55e594-bb43-4c68-95ad-dd7a7ca06528
spec:
  itemPath: vaults/systems/items/dummy
status:
  conditions:
  - lastTransitionTime: "2022-11-22T02:35:27Z"
    message: 'Failed to retrieve item: status 408: sync did not complete within timeout,
      please retry the request'
    status: "False"
    type: Ready
```
Logs of the operator pod:

```sh
kubectl -n opconnect logs onepassword-connect-operator-7467685677-4j45g
```
```json
{"level":"info","ts":1669084536.1768796,"logger":"controller_onepassworditem","msg":"Reconciling OnePasswordItem","Request.Namespace":"default","Request.Name":"password"}
{"level":"error","ts":1669084544.651247,"logger":"controller-runtime.controller","msg":"Reconciler error","controller":"onepassworditem-controller","request":"default/password","error":"Failed to retrieve item: status 500: failed to initiate, review service logs for details","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\t/workspace/vendor/github.com/go-logr/zapr/zapr.go:128\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:258\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"info","ts":1669084544.7315106,"logger":"controller_onepassworditem","msg":"Reconciling OnePasswordItem","Request.Namespace":"default","Request.Name":"password"}
```

ag-adampike commented 1 year ago

Hey @ankycooper! Thanks for filing this issue.

We encountered something similar with another customer recently, and in that case it turned out to be an Advanced Protection firewall rule for their 1Password account which was blocking Cloud Providers.

Is your Kubernetes cluster hosted on a cloud provider (e.g. AWS, Azure, Google Cloud, etc.)? If so, can you check your firewall settings to see if that is the case? It could also be another firewall rule depending on what is set there.

To confirm this is the error you're seeing, please check the logs from the connect-sync container of the Pod for 1Password Connect. You should see something like:

```
…
(Forbidden (Firewall Rule)), Your request was blocked by an Advanced Protection firewall rule.
…
```
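
Something along these lines should show those logs (the Deployment and container names here are the chart defaults, so adjust them if you've overridden anything):

```sh
# Fetch logs from the connect-sync container of the Connect Deployment
kubectl -n opconnect logs deployment/onepassword-connect -c connect-sync
```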
ankycooper commented 1 year ago

It's a self-hosted dev k3s cluster.

The Terraform and Ansible integrations work, and the op CLI works as well, so I assume my network firewall isn't blocking anything.

I don't have a Business account but a Family account, so there is no 1Password firewall.
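
To rule out the cluster's own outbound connectivity (as opposed to my workstation's), a throwaway curl pod like the one below should confirm the nodes can reach 1Password over HTTPS; this is only a rough check and doesn't cover every endpoint Connect syncs with:

```sh
# One-off pod that makes a HEAD request to 1Password and is removed afterwards
kubectl run curl-test --rm -it --restart=Never --image=curlimages/curl -- \
  curl -sI https://my.1password.com
```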


ankycooper commented 1 year ago

It works with minikube, but fails with k3s with the error above.

ag-adampike commented 1 year ago

Thanks for the updates. It initially smelled a lot like the issue we recently encountered with another customer, which is why I jumped on it.

Thanks for the extra detail; the more information the better! It's particularly interesting that the behaviour is inconsistent across platforms, and this seems like an operator-specific issue rather than a 1Password Connect one (i.e. the plugin, not the server itself).

All that said, I'll defer to our engineering team for a deeper investigation.

ankycooper commented 1 year ago

Thanks, please let me know if you happen to find a fix.


mdnfiras commented 1 year ago

I think there is already a fix that got merged: https://github.com/1Password/connect-helm-charts/pull/108

We also had this problem, and updating to chart version 1.9.0 fixed it.
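
If it helps, upgrading should just be a matter of something like the following (release name, namespace, and existing values assumed to match the install from the issue description):

```sh
# Refresh the chart index and upgrade the existing release to chart 1.9.0,
# keeping the values that were set at install time
helm repo update
helm upgrade connect 1password/connect --version 1.9.0 --reuse-values --namespace opconnect
```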

ankycooper commented 1 year ago

I'm already on v1.9.0. I tried v1.8.1 and got a different error message (the status code is still 500):

```json
{"level":"info","ts":1669318119.2693477,"logger":"controller_onepassworditem","msg":"Reconciling OnePasswordItem","Request.Namespace":"default","Request.Name":"upsteam-apikey"}
{"level":"error","ts":1669318127.3086646,"logger":"controller-runtime.controller","msg":"Reconciler error","controller":"onepassworditem-controller","request":"default/upsteam-apikey","error":"Failed to retrieve item: status 500: failed to initiate, review service logs for details","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\t/workspace/vendor/github.com/go-logr/zapr/zapr.go:128\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:258\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\t/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/wait.Until\n\t/workspace/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:90"}
{"level":"info","ts":1669318128.3101885,"logger":"controller_onepassworditem","msg":"Reconciling OnePasswordItem","Request.Namespace":"default","Request.Name":"upsteam-apikey"}
```

jpcoenen commented 1 year ago

Apologies for the late reply here. @ankycooper, are you still running into issues? If so, could you please check the logs of the API container of the Connect Pod? It's a bit hidden, but the following line suggests Connect is not able to initialize correctly:

```
Failed to retrieve item: status 500: failed to initiate, review service logs for details
```
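
Something like this should get you those logs (Deployment and container names assumed to be the chart defaults):

```sh
# Fetch logs from the Connect API container of the Connect Deployment
kubectl -n opconnect logs deployment/onepassword-connect -c connect-api
```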
ankycooper commented 8 months ago

This issue is still there. I tried:

- Multiple Helm versions of Connect, incl. the latest (Connect Helm chart: 1.15.0, App: 1.7.2) -- no luck
- Multiple k3s versions -- no luck
- Multiple RKE2 versions -- no luck
- Single-node and multi-node clusters -- no luck
- Multiple hosting providers (AWS and self-hosted VM/bare metal) with Ubuntu and Debian -- no luck
- Disabled AppArmor -- no luck

And:

- Rancher Desktop (k3s under the hood) -- works
- AWS Linux with k3s -- intermittently works
- Fedora with k3s (self-hosted) -- works

So far it does not work on Debian/Ubuntu.

Logs from the Connect pod when not working:

```json
{"log_message":"(I) GET /v1/vaults?filter=title+eq+%22systems%22 completed (500: Internal Server Error) in 669ms","timestamp":"2024-03-19T11:49:24.729219254Z","level":3,"scope":{"request_id":"7de12806-0199-411a-8009-f1d1e76394e7"}}
{"log_message":"(I) GET /v1/vaults?filter=title+eq+%22systems%22","timestamp":"2024-03-19T11:49:24.761845985Z","level":3,"scope":{"request_id":"f2258628-0bf1-4d4c-ae64-bcf64b760afa"}}
{"log_message":"(I) notifying syncer of new token","timestamp":"2024-03-19T11:49:24.762317552Z","level":3,"scope":{"request_id":"f2258628-0bf1-4d4c-ae64-bcf64b760afa","jti":"5pjml2crowlxdwyvrdrgaxjwme"}}
{"log_message":"(I) awaiting healthy syncer before continuing","timestamp":"2024-03-19T11:49:24.763981556Z","level":3,"scope":{"request_id":"f2258628-0bf1-4d4c-ae64-bcf64b760afa","jti":"5pjml2crowlxdwyvrdrgaxjwme"}}
{"log_message":"(I) GET /heartbeat","timestamp":"2024-03-19T11:49:29.038462699Z","level":3,"scope":{"request_id":"f9446679-2c6c-44a7-b2d2-5664296c285f"}}
{"log_message":"(I) GET /heartbeat completed (200: OK) in 1ms","timestamp":"2024-03-19T11:49:29.03910843Z","level":3,"scope":{"request_id":"f9446679-2c6c-44a7-b2d2-5664296c285f"}}
{"log_message":"(I) GET /health","timestamp":"2024-03-19T11:49:29.041706581Z","level":3,"scope":{"request_id":"a6744bf5-e302-411a-b68e-61315787650d"}}
{"log_message":"(I) GET /health completed (200: OK) in 2ms","timestamp":"2024-03-19T11:49:29.043975352Z","level":3,"scope":{"request_id":"a6744bf5-e302-411a-b68e-61315787650d"}}
{"log_message":"(I) GET /v1/vaults?filter=title+eq+%22systems%22 completed (408: Request Timeout) in 10003ms","timestamp":"2024-03-19T11:49:34.76534844Z","level":3,"scope":{"request_id":"f2258628-0bf1-4d4c-ae64-bcf64b760afa"}}
{"log_message":"(I) GET /v1/vaults?filter=title+eq+%22systems%22","timestamp":"2024-03-19T11:49:34.80400306Z","level":3,"scope":{"request_id":"c218a30a-0053-4b0e-a85c-bc3ea5e33400"}}
{"log_message":"(I) notifying syncer of new token","timestamp":"2024-03-19T11:49:34.80468008Z","level":3,"scope":{"request_id":"c218a30a-0053-4b0e-a85c-bc3ea5e33400","jti":"5pjml2crowlxdwyvrdrgaxjwme"}}
```

Logs from the operator when not working:

```
2024-03-19T11:51:22Z ERROR Reconciler error {"controller": "onepassworditem", "controllerGroup": "onepassword.com", "controllerKind": "OnePasswordItem", "OnePasswordItem": {"name":"test","namespace":"default"}, "namespace": "default", "name": "test", "reconcileID": "ae24e296-f6da-4a5c-93fd-d3f34e7db47e", "error": "Failed to retrieve item: status 408: sync did not complete within timeout, please retry the request"}
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler
	/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:329
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem
	/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:266
sigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).Start.func2.2
	/workspace/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:227
```
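
For what it's worth, the health endpoints respond with 200 even while the vault requests time out. They can be hit directly via a port-forward along these lines (service name and port assumed to be the chart defaults):

```sh
# Forward the Connect API port locally, then query the health endpoints
kubectl -n opconnect port-forward svc/onepassword-connect 8080:8080 &
curl -s http://localhost:8080/heartbeat
curl -s http://localhost:8080/health
```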

bdsoha commented 6 months ago

I am receiving the same error on a clean k3s cluster (version 1.29).