vmware-tanzu / velero

Backup and migrate Kubernetes applications and their persistent volumes
https://velero.io
Apache License 2.0

Backups are failing consistently. #6100

Closed: nwakalka closed this issue 11 months ago

nwakalka commented 1 year ago

What steps did you take and what happened:

We are trying to run Velero backups for cluster resources, but the backups are failing consistently.

[root@***]# velero_pod=$(oc get po -n mcs-backup --no-headers -l component=mcs-velero | awk '{print $1}'); oc exec -n mcs-backup "$velero_pod" -- /velero backup describe backuptest --details --insecure-skip-tls-verify
Defaulted container "mcs-velero" out of: mcs-velero, mcs-velero-plugin-for-aws (init), mcs-velero-plugin-for-azure (init), mcs-plugin-ocp (init)

Name:         backuptest
Namespace:    mcs-backup
Labels:       velero.io/storage-location=internal
Annotations:  velero.io/source-cluster-k8s-gitversion=v1.24.0+3882f8f
              velero.io/source-cluster-k8s-major-version=1
              velero.io/source-cluster-k8s-minor-version=24

Phase:  Failed (run `velero backup logs backuptest` for more information)

Errors:    0
Warnings:  0

Namespaces:
  Included:  *
  Excluded:  mcs-lifecycle-check

Resources:
  Included:        *
  Excluded:        <none>
  Cluster-scoped:  auto

Label selector:  <none>

Storage Location:  internal

Velero-Native Snapshot PVs:  auto

TTL:  720h0m0s

Hooks:  <none>

Backup Format Version:  1.1.0

Started:    2023-04-05 09:36:58 +0000 UTC
Completed:  2023-04-05 09:40:52 +0000 UTC

Expiration:  2023-05-05 09:36:58 +0000 UTC

Total items to be backed up:  14835
Items backed up:              14835

Resource List:  <backup resource list not found>

Velero-Native Snapshots: <none included>

Here you can find the backup logs for more info:

[root]# velero_pod=$(oc get po -n mcs-backup --no-headers -l component=mcs-velero | awk '{print $1}'); oc exec -n mcs-backup "$velero_pod" -- /velero backup logs backuptest --insecure-skip-tls-verify | grep -Ei "fail|warn|error"
Defaulted container "mcs-velero" out of: mcs-velero, mcs-velero-plugin-for-aws (init), mcs-velero-plugin-for-azure (init), mcs-plugin-ocp (init)
I0405 09:53:51.459514    1694 request.go:601] Waited for 1.184274562s due to client-side throttling, not priority and fairness, request: GET:https://172.18.0.1:443/apis/kafka.strimzi.io/v1alpha1?timeout=32s
time="2023-04-05T09:36:58Z" level=debug msg="Checking for AWS specific error information" backup=mcs-backup/backuptest bucket=ocp501-k99rp-backup-internal cmd=/plugins/velero-plugin-for-aws key=backup-v2/backups/backuptest/velero-backup.json logSource="/go/src/github.com/vmware-tanzu/velero-plugin-for-aws/velero-plugin-for-aws/object_store.go:368" pluginName=velero-plugin-for-aws
time="2023-04-05T09:36:58Z" level=debug msg="awserr.Error contents (origErr=<nil>)" backup=mcs-backup/backuptest bucket=ocp501-k99rp-backup-internal cmd=/plugins/velero-plugin-for-aws code=NotFound key=backup-v2/backups/backuptest/velero-backup.json logSource="/go/src/github.com/vmware-tanzu/velero-plugin-for-aws/velero-plugin-for-aws/object_store.go:375" message="Not Found" pluginName=velero-plugin-for-aws
time="2023-04-05T09:38:15Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-2e819bf0-f7d9-4503-b124-f57cdfaeee22 namespace= persistentVolume=pvc-2e819bf0-f7d9-4503-b124-f57cdfaeee22 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-b48c1d60-c041-42b3-a2ed-576a630b5ab9 namespace= persistentVolume=pvc-b48c1d60-c041-42b3-a2ed-576a630b5ab9 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-584468ce-1209-482f-aa72-1be647f00973 namespace= persistentVolume=pvc-584468ce-1209-482f-aa72-1be647f00973 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-6c4f9d39-112d-4e58-9cd0-2b51f1ba7c81 namespace= persistentVolume=pvc-6c4f9d39-112d-4e58-9cd0-2b51f1ba7c81 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-26a2fbfb-945a-405b-95d5-7b1c48ee8efc namespace= persistentVolume=pvc-26a2fbfb-945a-405b-95d5-7b1c48ee8efc resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-5a8c5974-c922-440a-890d-c6b48d9d9ab8 namespace= persistentVolume=pvc-5a8c5974-c922-440a-890d-c6b48d9d9ab8 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-7ae4b321-596f-4b56-8dac-6a8623127460 namespace= persistentVolume=pvc-7ae4b321-596f-4b56-8dac-6a8623127460 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-79e96b38-06d2-41e5-8795-84d338aebe35 namespace= persistentVolume=pvc-79e96b38-06d2-41e5-8795-84d338aebe35 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-daa3398a-bb2f-43bd-9958-98b5e0d88777 namespace= persistentVolume=pvc-daa3398a-bb2f-43bd-9958-98b5e0d88777 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-744ec9aa-993a-4b31-a351-395803915010 namespace= persistentVolume=pvc-744ec9aa-993a-4b31-a351-395803915010 resource=persistentvolumes
time="2023-04-05T09:38:16Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-4f659538-e432-480f-b944-1d6c66202a28 namespace= persistentVolume=pvc-4f659538-e432-480f-b944-1d6c66202a28 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-60267ee0-f967-4bc5-b6b0-e8e58d94f852 namespace= persistentVolume=pvc-60267ee0-f967-4bc5-b6b0-e8e58d94f852 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-cc955e31-aaa3-4f62-b264-5a8b47599e86 namespace= persistentVolume=pvc-cc955e31-aaa3-4f62-b264-5a8b47599e86 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-a00a6a14-edb7-4fb8-b274-8519c9ed2a3d namespace= persistentVolume=pvc-a00a6a14-edb7-4fb8-b274-8519c9ed2a3d resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-574281bd-1bcc-43fb-a1da-ad026f9f9659 namespace= persistentVolume=pvc-574281bd-1bcc-43fb-a1da-ad026f9f9659 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-4c02ad30-fef3-4ffb-851f-4f277a38fa84 namespace= persistentVolume=pvc-4c02ad30-fef3-4ffb-851f-4f277a38fa84 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-302f34d1-15e8-4bee-93f0-83ed66ae990b namespace= persistentVolume=pvc-302f34d1-15e8-4bee-93f0-83ed66ae990b resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-6d059296-a53a-47d8-b2d1-3ff27cc51736 namespace= persistentVolume=pvc-6d059296-a53a-47d8-b2d1-3ff27cc51736 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-5b863d0d-bdda-4c0b-8b3d-445f8abe2dd3 namespace= persistentVolume=pvc-5b863d0d-bdda-4c0b-8b3d-445f8abe2dd3 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-c1ca4919-e43b-4a2e-b923-5db235f338c8 namespace= persistentVolume=pvc-c1ca4919-e43b-4a2e-b923-5db235f338c8 resource=persistentvolumes
time="2023-04-05T09:38:17Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-c9fe179c-6802-4ba3-ad21-08057d8c8694 namespace= persistentVolume=pvc-c9fe179c-6802-4ba3-ad21-08057d8c8694 resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-ca76827a-8f09-4e93-8548-18b38cb81bab namespace= persistentVolume=pvc-ca76827a-8f09-4e93-8548-18b38cb81bab resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-8ca092bc-85b9-469a-8872-fb18cc9fcdb7 namespace= persistentVolume=pvc-8ca092bc-85b9-469a-8872-fb18cc9fcdb7 resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-67735000-7dc6-4f04-9a4b-a46ec12a228d namespace= persistentVolume=pvc-67735000-7dc6-4f04-9a4b-a46ec12a228d resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-8bf38425-9be1-47cc-b852-68924e5c6d77 namespace= persistentVolume=pvc-8bf38425-9be1-47cc-b852-68924e5c6d77 resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-a70fe194-e2cf-474a-9e9e-f02d35fbe67c namespace= persistentVolume=pvc-a70fe194-e2cf-474a-9e9e-f02d35fbe67c resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-9adcd289-45d8-422e-8a2b-17764a93f15e namespace= persistentVolume=pvc-9adcd289-45d8-422e-8a2b-17764a93f15e resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-916a477a-c807-4007-beaa-05e21752e5bb namespace= persistentVolume=pvc-916a477a-c807-4007-beaa-05e21752e5bb resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-45590a14-c614-4293-88a1-0f573092d9be namespace= persistentVolume=pvc-45590a14-c614-4293-88a1-0f573092d9be resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-10f50765-0e88-4e13-9be2-e32941d52010 namespace= persistentVolume=pvc-10f50765-0e88-4e13-9be2-e32941d52010 resource=persistentvolumes
time="2023-04-05T09:38:18Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-7bed734b-4884-41d1-a991-1678fcb0b6d1 namespace= persistentVolume=pvc-7bed734b-4884-41d1-a991-1678fcb0b6d1 resource=persistentvolumes
time="2023-04-05T09:38:19Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-5c75ec35-ac79-4f60-8e71-00965958a49b namespace= persistentVolume=pvc-5c75ec35-ac79-4f60-8e71-00965958a49b resource=persistentvolumes
time="2023-04-05T09:38:19Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-1db9f493-979d-4330-9ad6-5564d11a911f namespace= persistentVolume=pvc-1db9f493-979d-4330-9ad6-5564d11a911f resource=persistentvolumes
time="2023-04-05T09:38:19Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-f18eab27-aa84-447d-96c2-ec0665899ecd namespace= persistentVolume=pvc-f18eab27-aa84-447d-96c2-ec0665899ecd resource=persistentvolumes
time="2023-04-05T09:38:20Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-05c7a85d-f3e7-401b-aee0-48c6e3cb03d7 namespace= persistentVolume=pvc-05c7a85d-f3e7-401b-aee0-48c6e3cb03d7 resource=persistentvolumes
time="2023-04-05T09:38:24Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-56d70ec9-ed9a-4107-933a-20bc23924ee8 namespace= persistentVolume=pvc-56d70ec9-ed9a-4107-933a-20bc23924ee8 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-1943e908-78ed-47af-91d3-3796ef2fdc6e namespace= persistentVolume=pvc-1943e908-78ed-47af-91d3-3796ef2fdc6e resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-0dfe7ba2-8306-4b50-88ab-9efd47e20f7c namespace= persistentVolume=pvc-0dfe7ba2-8306-4b50-88ab-9efd47e20f7c resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-1cc5e393-42fd-4231-8e96-b5c948dad79b namespace= persistentVolume=pvc-1cc5e393-42fd-4231-8e96-b5c948dad79b resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-6a61055c-8318-4b70-aec2-7535db881bf8 namespace= persistentVolume=pvc-6a61055c-8318-4b70-aec2-7535db881bf8 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-fd97b652-4b8e-42da-9ea0-9927f3578c09 namespace= persistentVolume=pvc-fd97b652-4b8e-42da-9ea0-9927f3578c09 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-666ca942-d441-4b70-9a85-fbc188762686 namespace= persistentVolume=pvc-666ca942-d441-4b70-9a85-fbc188762686 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-4f004f7f-406d-4934-81c8-2268c9503086 namespace= persistentVolume=pvc-4f004f7f-406d-4934-81c8-2268c9503086 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-8cc90e1d-7bf3-4d98-98bc-0c70fbc404ce namespace= persistentVolume=pvc-8cc90e1d-7bf3-4d98-98bc-0c70fbc404ce resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-2cf9dba4-bda7-4f05-993d-4fcc27f37f36 namespace= persistentVolume=pvc-2cf9dba4-bda7-4f05-993d-4fcc27f37f36 resource=persistentvolumes
time="2023-04-05T09:38:33Z" level=info msg="label \"failure-domain.beta.kubernetes.io/zone\" is not present on PersistentVolume" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:453" name=pvc-83055340-7002-49e1-a33f-b01cf3afa43e namespace= persistentVolume=pvc-83055340-7002-49e1-a33f-b01cf3afa43e resource=persistentvolumes
time="2023-04-05T09:40:50Z" level=info msg="Processing item" backup=mcs-backup/backuptest logSource="pkg/backup/backup.go:340" name=audit-errors namespace=openshift-kube-apiserver progress= resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=info msg="Backing up item" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:133" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Executing pre hooks" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:135" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Skipping action because it does not apply to this resource" backup=mcs-backup/backuptest logSource="pkg/plugin/framework/action_resolver.go:58" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Executing post hooks" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:220" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=debug msg="Resource prometheusrules.monitoring.coreos.com/audit-errors, version= v1, preferredVersion=v1" backup=mcs-backup/backuptest logSource="pkg/backup/item_backupper.go:271" name=audit-errors namespace=openshift-kube-apiserver resource=prometheusrules.monitoring.coreos.com
time="2023-04-05T09:40:50Z" level=info msg="Backed up 14417 items out of an estimated total of 14802 (estimate will change throughout the backup)" backup=mcs-backup/backuptest logSource="pkg/backup/backup.go:380" name=audit-errors namespace=openshift-kube-apiserver progress= resource=prometheusrules.monitoring.coreos.com

Please find the attached Velero pod logs for more info:

Backup_logs.txt

Our BackupStorageLocation is available and able to connect to our OBS buckets.

[root]# oc get backups.velero.io -A
NAMESPACE    NAME         AGE
mcs-backup   backuptest   65m

[root]# oc get backupstoragelocations.velero.io -A
NAMESPACE    NAME       PHASE       LAST VALIDATED   AGE   DEFAULT
mcs-backup   internal   Available   22s              71m   true
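
Two quick checks could narrow this down (an editor's sketch, not from the thread; the bucket name and key prefix are taken from the AWS-plugin log lines above, and the listing assumes S3-compatible credentials plus an OBS endpoint, which is a placeholder here). A Failed phase combined with `Resource List: <backup resource list not found>` suggests the final metadata upload to object storage may not have completed:

# inspect the BackupStorageLocation for validation details
oc describe backupstoragelocations.velero.io internal -n mcs-backup

# list what actually landed under the backup's prefix; velero-backup.json,
# backuptest-logs.gz and backuptest-resource-list.json.gz would be expected
aws s3 ls s3://ocp501-k99rp-backup-internal/backup-v2/backups/backuptest/ --endpoint-url <obs-endpoint>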

What did you expect to happen:

The following information will help us better understand what's going on:

If you are using velero v1.7.0+:
Please use `velero debug --backup <backupname> --restore <restorename>` to generate the support bundle and attach it to this issue; for more options, please refer to `velero debug --help`.

If you are using earlier versions:
Please provide the output of the following commands (Pasting long output into a GitHub gist or other pastebin is fine.)

Anything else you would like to add:

Environment:

qiuming-best commented 1 year ago

@nwakalka the logs you provided are incomplete. Could you generate the debug bundle with the command `velero debug --backup $backupName` and upload it to this issue?
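
With the backup name from this issue substituted in, that request reads as follows (run with the velero CLI against the cluster, not inside the pod, as clarified further down the thread):

velero debug --backup backuptest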

nwakalka commented 1 year ago

Hi @qiuming-best ,

We are facing an error while generating the debug bundle.

nikhil@nikhil-virtual-machine:~/Downloads/SFS/nwakalka$ oc get deployments -n mcs-backup
NAME              READY     UP-TO-DATE   AVAILABLE   AGE
mcs-velero        1/1       1            1           49d
s3-backup-proxy   3/3       3            3           49d

nikhil@nikhil-virtual-machine:~/Downloads/SFS/nwakalka$ kubectl exec -it mcs-velero-576f94d5cb-dpwhc  -n mcs-backup -- /velero debug --backup ddaily-backup-20230413020006
Defaulted container "mcs-velero" out of: mcs-velero, mcs-velero-plugin-for-aws (init), mcs-velero-plugin-for-azure (init), mcs-plugin-ocp (init)
An error occurred: velero deployment does not exist in namespace: mcs-backup
command terminated with exit code 1

So I changed the deployment from mcs-velero to velero, but I am still facing the same error:


nikhil@nikhil-virtual-machine:~/Downloads/SFS/nwakalka$ kubectl get deployments -n mcs-backup
NAME              READY   UP-TO-DATE   AVAILABLE   AGE
mcs-velero        0/0     0            0           49d
s3-backup-proxy   3/3     3            3           49d
velero            1/1     1            1           2m50s
nikhil@nikhil-virtual-machine:~/Downloads/SFS/nwakalka$ kubectl get deployments -n mcs-backup velero 
NAME     READY   UP-TO-DATE   AVAILABLE   AGE
velero   1/1     1            1           3m11s
nikhil@nikhil-virtual-machine:~/Downloads/SFS/nwakalka$ kubectl exec -it velero-7c96c885dd-2k26s  -n mcs-backup -- /velero debug --backup daily-backup-20230413020006
Defaulted container "velero" out of: velero, mcs-velero-plugin-for-aws (init), mcs-velero-plugin-for-azure (init), mcs-plugin-ocp (init)
An error occurred: velero deployment does not exist in namespace: mcs-backup
command terminated with exit code 1
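
One detail worth checking at this point (an editor's suggestion, not raised in the thread): the velero CLI resolves its default namespace from its client configuration, which may not match the install namespace used here:

# show the namespace the CLI currently targets
velero client config get namespace
# point the CLI at the install namespace from this thread
velero client config set namespace=mcs-backup
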
qiuming-best commented 1 year ago

@nwakalka there is no need to execute this command in the pod; you just need to run the velero CLI on your local machine.
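
A minimal sketch of that workflow, assuming a Linux amd64 workstation with kubeconfig access to the cluster (the download URL follows Velero's usual release naming):

# fetch and unpack the CLI on the workstation
curl -LO https://github.com/vmware-tanzu/velero/releases/download/v1.9.3/velero-v1.9.3-linux-amd64.tar.gz
tar -xzf velero-v1.9.3-linux-amd64.tar.gz
# run the debug bundle against the install namespace from this thread
./velero-v1.9.3-linux-amd64/velero debug --backup backuptest -n mcs-backup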

GVOROS commented 1 year ago

@qiuming-best We also tried with the velero CLI, but we got the same issue:

# ./velero-v1.9.3-linux-amd64 -n mcs-backup debug --backup backuptest
An error occurred: velero deployment does not exist in namespace: mcs-backup
# ./velero-v1.9.3-linux-amd64  debug --backup backuptest
An error occurred: velero deployment does not exist in namespace: mcs-backup
# oc get deployment -n mcs-backup
NAME              READY   UP-TO-DATE   AVAILABLE   AGE
mcs-velero        1/1     1            1           8d
s3-backup-proxy   2/2     2            2           8d
velero            1/1     1            1           35s
# ./velero-v1.9.3-linux-amd64  debug --backup backuptest
An error occurred: velero deployment does not exist in namespace: mcs-backup
# ./velero-v1.9.3-linux-amd64  debug --backup backuptest --verbose
An error occurred: velero deployment does not exist in namespace: mcs-backup
# ./velero-v1.9.3-linux-amd64  debug --backup backuptest -v 9
An error occurred: velero deployment does not exist in namespace: mcs-backup

We also tried to modify the name of the deployment, but it did not help.
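
One more data point that might help triage (an editor's suggestion): capture the client and server versions, since a mismatch between the CLI and the server could be relevant here:

./velero-v1.9.3-linux-amd64 version -n mcs-backup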

qiuming-best commented 1 year ago

@GVOROS the correct command to generate the bundle should be ./velero-v1.9.3-linux-amd64 debug --backup backuptest

GVOROS commented 1 year ago

@qiuming-best

# ./velero-v1.9.3-linux-amd64 debug --backup backuptest
An error occurred: velero deployment does not exist in namespace: mcs-backup

nwakalka commented 1 year ago

Hi @qiuming-best

I was wondering if there have been any updates on this issue. I understand that you may be busy, but I would greatly appreciate any information you could provide.

Thank you in advance.

qiuming-best commented 1 year ago

@nwakalka Sorry for the late response. You need to add the namespace to the command; the complete command would be ./velero-v1.9.3-linux-amd64 debug --backup backuptest -n velero

GVOROS commented 1 year ago

@qiuming-best We do not have a velero namespace. As you can see, we also do not have a deployment called velero; instead we have mcs-velero. Could that be the problem?

# ./velero-v1.9.3-linux-amd64 debug --backup backuptest -n velero
An error occurred: velero deployment does not exist in namespace: velero  
# ./velero-v1.9.3-linux-amd64 debug --backup backuptest -n mcs-backup
An error occurred: velero deployment does not exist in namespace: mcs-backup

# oc get deployment -n mcs-backup | grep velero
mcs-velero        1/1     1            1           46d
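
Since the debug command keeps failing to locate the Velero deployment in this install, a manual collection of the key artifacts it would gather may be the pragmatic workaround (an editor's sketch; all names are taken from earlier in this thread):

# server logs from the actual deployment name used here
oc -n mcs-backup logs deploy/mcs-velero -c mcs-velero > velero-server.log
# the Backup custom resource itself
oc -n mcs-backup get backups.velero.io backuptest -o yaml > backup-cr.yaml
# per-backup logs via the in-pod CLI, as done earlier in the thread
oc -n mcs-backup exec deploy/mcs-velero -- /velero backup logs backuptest --insecure-skip-tls-verify > backuptest.log
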
GVOROS commented 1 year ago

@qiuming-best May I ask if you have any updates?

GVOROS commented 1 year ago

Hi @qiuming-best, Do you have any news?

github-actions[bot] commented 1 year ago

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days. If a Velero team member has requested log or more information, please provide the output of the shared commands.

GVOROS commented 1 year ago

Anyone can help us?

github-actions[bot] commented 11 months ago

This issue is stale because it has been open 60 days with no activity. Remove stale label or comment or this will be closed in 14 days. If a Velero team member has requested log or more information, please provide the output of the shared commands.

github-actions[bot] commented 11 months ago

This issue was closed because it has been stalled for 14 days with no activity.