argoproj / argo-cd

Declarative Continuous Deployment for Kubernetes
https://argo-cd.readthedocs.io
Apache License 2.0

ArgoCD 1.4 Resources synchronized but still in OutOfSynch/Missing #3008

Closed rayanebel closed 4 years ago

rayanebel commented 4 years ago

Checklist:

Hello everyone,

I deployed Argo CD 1.4 in our OpenShift cluster, which already has applications deployed with Helm 3, and I tried to import all of those applications into Argo CD. Since Argo CD does not support the Helm v3 API, after some research on my side I changed the apiVersion in my charts to v1.

After that, I created my application in Argo CD and configured it to point to my Helm chart (which is in a Git repository). The application is created and its state is OutOfSync. When I click Synchronize, Argo CD does its job and reports the synchronization as OK, but my application remains OutOfSync/Missing.

I tried deleting some resources from the UI and synchronizing again, but got the same result.

Can someone help me understand what's going on? I'm really stuck and I can't use Argo CD.

UPDATE: When I deploy my application in the namespace where Argo CD is running, it works: I can see my application as Healthy and Synced. But when I do the same in ANOTHER NAMESPACE, it does not work. It's very strange.

I'm deploying to my local cluster. I created a new cluster entry with kubernetes.default.svc to replace the default one so I could set it up with a namespaced scope, created a dedicated ServiceAccount, and set its token in the cluster configuration. Then, as I'm on OpenShift, I granted the admin role to my new ServiceAccount. Finally, for each destination namespace I created a RoleBinding that binds the admin role to the ServiceAccount system:serviceaccount:<NAMESPACE_OF_SA>:<DEST_NAMESPACE>.
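A minimal sketch of one such RoleBinding, with placeholder names (the ServiceAccount name below is an example, not the exact manifest from my cluster):

apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: argocd-manager-admin
  namespace: <DEST_NAMESPACE>
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: admin          # OpenShift's built-in admin cluster role
subjects:
  - kind: ServiceAccount
    name: argocd-manager        # placeholder ServiceAccount name
    namespace: <NAMESPACE_OF_SA>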

Argo CD is able to run all of its kubectl apply commands against this destination namespace, but the UI does not seem to be able to fetch the resources, and I don't know why. Is it an RBAC problem or something else? What can I do to troubleshoot this? I'm stuck.

To Reproduce

Expected behavior

The UI should show my application as synchronized and healthy.

Screenshots

(screenshots: screen1, screen2)

Version

argocd: v1.4.0-rc1+5af52f6
  BuildDate: 2020-01-13T17:23:04Z
  GitCommit: 5af52f66988ad8fa0d6b977d7f5aedcdb9f5a521
  GitTreeState: clean
  GoVersion: go1.12.6
  Compiler: gc
  Platform: darwin/amd64
argocd-server: v1.4.0+2d02948
  BuildDate: 2020-01-18T05:55:02Z
  GitCommit: 2d029488aba6e5ad48b2a756bfcf43d5cb7abcee
  GitTreeState: clean
  GoVersion: go1.12.6
  Compiler: gc
  Platform: linux/amd64
  Ksonnet Version: v0.13.1
  Kustomize Version: Version: {Version:kustomize/v3.2.1 GitCommit:d89b448c745937f0cf1936162f26a5aac688f840 BuildDate:2019-09-27T00:10:52Z GoOs:linux GoArch:amd64}
  Helm Version: v2.15.2
  Kubectl Version: v1.14.0

Logs

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=info msg="Comparing app state (cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr)" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="https://github.com/thedigitalstudio/EBP-charts has credentials"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Generated config manifests" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Retrieved lived manifests" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="built managed objects list" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type apps.openshift.io/v1, Kind=DeploymentConfig: no kind \"DeploymentConfig\" is registered for version \"apps.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type apps.openshift.io/v1, Kind=DeploymentConfig: no kind \"DeploymentConfig\" is registered for version \"apps.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=info msg=syncing application=ebportal-dev isSelectiveSync=false skipHooks=false started=false syncId=00001-DZdfY

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="tasks from hooks" application=ebportal-dev hookTasks="[]" syncId=00001-DZdfY
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'cronjobs' for batch/v1beta1, Kind=CronJob"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'deploymentconfigs' for apps.openshift.io/v1, Kind=DeploymentConfig"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'deploymentconfigs' for apps.openshift.io/v1, Kind=DeploymentConfig"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'services' for /v1, Kind=Service"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'services' for /v1, Kind=Service"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:37Z" level=info msg="Applying resource CronJob/eb-portal-fileloader-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:37Z" level=debug msg="{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\",\"metadata\":{\"labels\":{\"app.kubernetes.io/instance\":\"ebportal-dev\"},\"name\":\"eb-portal-fileloader-france-dev\",\"namespace\":\"eb-portal-eu-dev-axa-ebp-fr\"},\"spec\":{\"concurrencyPolicy\":\"Allow\",\"jobTemplate\":{\"spec\":{\"template\":{\"metadata\":{\"labels\":{\"parent\":\"cronjobpi\"}},\"spec\":{\"containers\":[{\"env\":[{\"name\":\"EMAIL_TO\",\"value\":\"service.digitalstudio@axa.fr\"},{\"name\":\"S3_BUCKET\",\"value\":\"s3://eb-portal-eu-dev-ods/rec\"},{\"name\":\"S3_KEY\",\"value\":\"AKIAWSR7GURUKAHJCLAA\"},{\"name\":\"S3_ROLE\",\"value\":\"arn:aws:iam::452177863784:role/eb-portal-eu-dev_appservice\"},{\"name\":\"POSTGRES_URI_SECRET\",\"valueFrom\":{\"secretKeyRef\":{\"key\":\"POSTGRES_URI_SECRET\",\"name\":\"fileloader-eb-secret-france-dev\"}}},{\"name\":\"S3_SECRET\",\"valueFrom\":{\"secretKeyRef\":{\"key\":\"S3_SECRET\",\"name\":\"fileloader-eb-secret-france-dev\"}}},{\"name\":\"HTTP_PROXY\",\"value\":\"http://proxy:8080\"},{\"name\":\"HTTPS_PROXY\",\"value\":\"http://proxy:8080\"},{\"name\":\"http_proxy\",\"value\":\"http://proxy:8080\"},{\"name\":\"https_proxy\",\"value\":\"http://proxy:8080\"},{\"name\":\"NODE_ENV\",\"value\":\"production\"},{\"name\":\"SMTP_HOST\",\"value\":\"10.142.76.184\"},{\"name\":\"SMTP_PORT\",\"value\":\"25\"}],\"image\":\"docker.io/thedigitalstudio/france-file-loader:2.0.0\",\"imagePullPolicy\":\"Always\",\"name\":\"eb-portal-fileloader-france-dev\",\"resources\":{\"limits\":{\"cpu\":\"1\",\"memory\":\"1536Mi\"}},\"terminationMessagePath\":\"/dev/termination-log\"}],\"dnsPolicy\":\"ClusterFirst\",\"restartPolicy\":\"OnFailure\",\"securityContext\":{},\"terminationGracePeriodSeconds\":30}}}},\"schedule\":\"45 2 * * *\",\"suspend\":false}}"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=debug msg=applying application=ebportal-dev dryRun=true syncId=00001-DZdfY task="Sync/0 resource bitnami.com/SealedSecret:eb-portal-eu-dev-axa-ebp-fr/core-eb-secret-france-dev nil->obj (,,)"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=info msg="Applying resource SealedSecret/core-eb-secret-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=debug msg=applying application=ebportal-dev dryRun=true syncId=00001-DZdfY task="Sync/0 resource bitnami.com/SealedSecret:eb-portal-eu-dev-axa-ebp-fr/connector-eb-secret-france-dev nil->obj (,,)"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=info msg="Applying resource SealedSecret/connector-eb-secret-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Ingress.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Ingress.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch PodDisruptionBudget.policy on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch PodDisruptionBudget.policy on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch HorizontalPodAutoscaler.autoscaling on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch HorizontalPodAutoscaler.autoscaling on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ServiceAccount on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ServiceAccount on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch CronJob.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch CronJob.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Route.route.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Route.route.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ReplicationController on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ReplicationController on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Deployment.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Deployment.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch StatefulSet.apps on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch StatefulSet.apps on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch RoleBinding.rbac.authorization.k8s.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch RoleBinding.rbac.authorization.k8s.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ReplicaSet.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ReplicaSet.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ImageStream.image.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ImageStream.image.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch DeploymentConfig.apps.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch DeploymentConfig.apps.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Job.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Job.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
ichtestemalwieder commented 4 years ago

Most likely I have the same problem. I set up a new app with manual syncing, and after the first run I got the same result: it is successfully synced, but shows OutOfSync:


Events: (screenshot)

One important note: after manually clicking "Sync" again, everything is OK. So it is working for me now, but the initial behaviour is very confusing. Thanks.

Update, after the comment from paulcrecan: the same happened to me (but I didn't write it here). After installing Argo CD, nothing was working at all with regard to syncing. I then restarted the argocd-application-controller too, and after that it worked as described above. So there really does seem to be an issue...
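(For apps like this that only go green after a second manual sync, turning on automated sync with self-heal in the Application spec is one possible workaround — just a sketch, not a fix for the underlying bug:)

spec:
  syncPolicy:
    automated:
      prune: true     # remove resources that were deleted from Git
      selfHeal: true  # re-sync automatically when the live state drifts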

paulcrecan commented 4 years ago

+1. I have the same problem Ryan described. One more thing to add: when the argocd-application-controller is restarted, the applications sync successfully.

paulcrecan commented 4 years ago

Unfortunately this fix did not solve the issue described above. Even after upgrading to argocd:v1.4.2, the issue persists.

alexmt commented 4 years ago

Hello @paulcrecan ,

Can you please describe how you reproduced the issue?

isabo-lola commented 4 years ago

Hi @alexmt. Paul and I are working on the same team/project. We're currently running OKD v3.11 with Kubernetes v1.11. We've noticed inconsistent behaviour for several resources (mostly PVCs, Routes, and Bitnami SealedSecrets): they appear to be out of sync. In the Argo CD UI the resources look as if they aren't present in the cluster at all, so it is not just a diffing-customization issue, but upon closer inspection both Argo CD and OpenShift recognize that the resources are present and synced.

As @paulcrecan mentioned, if we restart the argocd-application-controller pod, the resources are fully in sync for a short period of time (under an hour), regardless of whether the application was just created or already existed in Argo. We've also increased the resource limits on the controller, but to no avail. Our current config for the pod is as follows:

- argocd-application-controller
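(Rough sketch of the relevant container section with approximate, not exact, values; the processor flags are documented argocd-application-controller options:)

containers:
  - name: argocd-application-controller
    command:
      - argocd-application-controller
      - --status-processors
      - "20"
      - --operation-processors
      - "10"
    resources:
      limits:
        cpu: "2"
        memory: 2Gi
      requests:
        cpu: 500m
        memory: 1Gi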

If any other details are necessary, please let me know!

alexmt commented 4 years ago

Sorry @isabo-lola, @paulcrecan. I did not notice your comment and forgot to follow up. Are you still experiencing this issue? It might be much easier to sync up in Slack.

jessesuen commented 4 years ago

This should be fixed in Argo CD v1.6 and later.

lunyi commented 3 years ago

hi @jessesuen
I'm using v1.7.6 and still have the same issue. Do I need to update something? (screenshot)

mabushey commented 3 years ago

I had this issue in 1.7.10, but it turned out to be caused by a startupProbe in the Deployment, a field which is not available until the cluster is upgraded from Kubernetes 1.17 to 1.18. It looks like kubectl quietly ignored the field, while Argo CD noticed the difference.

danilorsilva commented 3 years ago

Same problem here using ArgoCD 1.7.6.

Any news about how to solve this problem?

Thanks

danilorsilva commented 3 years ago

Solved.

I added the namespace I want Argo CD to manage to the secret that Argo CD uses to know which namespaces it can manage.
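(For reference, that is the cluster secret in the argocd namespace; the namespaces field is a comma-separated list of namespaces the cluster credential is allowed to manage. A sketch with placeholder values, using the declarative cluster-secret label supported in recent versions:)

apiVersion: v1
kind: Secret
metadata:
  name: cluster-kubernetes.default.svc   # placeholder name
  namespace: argocd
  labels:
    argocd.argoproj.io/secret-type: cluster
type: Opaque
stringData:
  name: in-cluster
  server: https://kubernetes.default.svc
  namespaces: my-app-namespace,another-namespace   # placeholder namespaces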

veerendra2 commented 3 years ago

I have the same problem in v1.8.7. We observed this issue after upgrading Argo CD from v1.7 to v1.8.

oc v3.11.0+0cbc58b
kubernetes v1.11.0+d4cacc0

Below is an example DeploymentConfig manifest:

kind: "DeploymentConfig"
apiVersion: "apps.openshift.io/v1"
metadata:
  name: "frontend"
spec:
  template:
    metadata:
      labels:
        name: "frontend"
    spec:
      containers:
        - name: "helloworld"
          image: "openshift/hello-openshift"
          ports:
            - containerPort: 8080
              protocol: "TCP"
  replicas: 1
  triggers:
    - type: "ConfigChange"
  strategy:
    type: "Rolling"
  paused: false
  revisionHistoryLimit: 2
  minReadySeconds: 0

trungklam commented 3 years ago

I'm having exactly this issue. I'm not quite sure: can we use the diff customization to just ignore the startupProbe?

sunshine69 commented 2 years ago

I have similar issues with v2.1.6+a346cf9.

aslafy-z commented 2 years ago

Similar issue with v2.2.1+122ecef, although sometimes I have to wait a few minutes after the (manual) sync finishes before seeing the resources as Synced (even though the diff shows nothing). This only happens for remote clusters.

sunshine69 commented 2 years ago

I have similar issues with v2.1.6+a346cf9.

I fixed the problem. It was mostly around Kubernetes HPA re-ordering, which has been mentioned elsewhere (I found it when googling): Kubernetes automatically re-orders the conditions based on key.

Another case was a mismatch between the Docker tag and the image tag, so it was not really an Argo CD problem (in my case).

ybialik commented 2 years ago

Did anyone get a fix for this?

We are seeing our Deployments synced on Kubernetes after a tag change, but the Argo UI shows them as out of sync for a minute or so before they get marked Synced.

sunshine69 commented 2 years ago

Did anyone get a fix for this?

We are seeing our Deployments synced on Kubernetes after a tag change, but the Argo UI shows them as out of sync for a minute or so before they get marked Synced.

I think you'd better check whether anything outside Argo CD is trying to update the live Kubernetes object, which would make it unsynced. I have just run into one more cause (in addition to the HPA ordering issue above): resourceVersion. If this is (wrongly) set in the metadata section of a Service, Kubernetes will change the value to a different one on deploy, and Argo CD will keep syncing forever (well, until it gives up).
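(The cleanest fix is to remove server-managed fields like resourceVersion from the manifests in Git. If that is not possible, an Application-level ignoreDifferences entry along these lines should also work — a sketch based on the documented Argo CD fields:)

spec:
  ignoreDifferences:
    - group: ""          # core API group
      kind: Service
      jsonPointers:
        - /metadata/resourceVersion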

sunshine69 commented 2 years ago

One way to check is to view the live manifest, the desired manifest, and the diff in the Argo CD UI.

yuezhuangshi commented 2 years ago

I'm having exactly this issue. I'm not quite sure: can we use the diff customization to just ignore the startupProbe?

I also encountered this problem recently, so I am posting it for your reference.

According to the Diffing Customization documentation, we can add the following customization to the argocd-cm ConfigMap:

# in my case: ignore all startupProbes in StatefulSets at the system level
resource.customizations.ignoreDifferences.apps_StatefulSet: |
  jqPathExpressions:
    - '.spec.template.spec.containers[]?.startupProbe'
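(The same rule can also be scoped to a single Application instead of the whole instance; recent Argo CD versions accept jqPathExpressions in the Application spec as well — a sketch:)

spec:
  ignoreDifferences:
    - group: apps
      kind: StatefulSet
      jqPathExpressions:
        - '.spec.template.spec.containers[]?.startupProbe'
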
maggie44 commented 10 months ago

One way to check is to view the live manifest, the desired manifest, and the diff in the Argo CD UI.

I think that is the real issue: there isn't any indication as to why it isn't syncing. If there is a startupProbe issue, a Docker container issue, or any of the other potential conflicts people have mentioned above, then Argo CD should give some indication of what it is (v2.9.3+6eba5be).

(screenshot)

In my case I can see it is the timestamps:

creationTimestamp: '2024-01-15T15:26:40Z'
  generation: 1

But I had to look at the Live and Desired manifests and compare them myself to notice it.