airflow-helm / charts

The User-Community Airflow Helm Chart is the standard way to deploy Apache Airflow on Kubernetes with Helm. Originally created in 2017, it has since helped thousands of companies create production-ready deployments of Airflow on Kubernetes.
https://github.com/airflow-helm/charts/tree/main/charts/airflow
Apache License 2.0

cannot patch "airflow-upgrade-db" with kind Job: Job.batch "airflow-upgrade-db" is invalid #99

Closed by Yinabled 3 years ago

Yinabled commented 3 years ago

What is the bug?

When trying to modify our Airflow deployment using the latest chart via Terraform, we consistently run into the following error:

Error: cannot patch "airflow-upgrade-db" with kind Job: Job.batch "airflow-upgrade-db" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"airflow", "component":"jobs", "controller-uid":"df138fad-f51c-4ed6-a01b-553c39731356", "job-name":"airflow-upgrade-db", "release":"airflow"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"check-db", Image:"apache/airflow:2.0.1-python3.8", Command:[]string{"/usr/bin/dumb-init", "--"}, Args:[]string{"bash", "-c", "exec airflow db check"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource{core.EnvFromSource{Prefix:"", ConfigMapRef:(*core.ConfigMapEnvSource)(nil), SecretRef:(*core.SecretEnvSource)(0xc02223d7e0)}}, Env:[]core.EnvVar{core.EnvVar{Name:"DATABASE_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc02223d820)}, core.EnvVar{Name:"REDIS_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc02223d840)}, core.EnvVar{Name:"PYTHONPATH", Value:"/opt/python/site-packages:/opt/airflow/dags/repo", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil)}, VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", 
SecurityContext:(*core.SecurityContext)(0xc006362730), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"upgrade-db", Image:"apache/airflow:2.0.1-python3.8", Command:[]string{"/usr/bin/dumb-init", "--"}, Args:[]string{"bash", "-c", "exec airflow db upgrade"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource{core.EnvFromSource{Prefix:"", ConfigMapRef:(*core.ConfigMapEnvSource)(nil), SecretRef:(*core.SecretEnvSource)(0xc02223d9a0)}}, Env:[]core.EnvVar{core.EnvVar{Name:"DATABASE_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc02223d9e0)}, core.EnvVar{Name:"REDIS_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc02223da00)}, core.EnvVar{Name:"PYTHONPATH", Value:"/opt/python/site-packages:/opt/airflow/dags/repo", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil)}, VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(0xc006362780), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"OnFailure", TerminationGracePeriodSeconds:(*int64)(0xc023915170), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc02175c380), ImagePullSecrets:[]core.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*core.Affinity)(nil), SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), 
PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil)}}: field is immutable

What are your Helm values?

The values that we merge with values.yaml via Terraform are as follows:

--- 
dags: 
  gitSync:
    enabled: true
    repo: "${var.airflow_config.github_repository_url}"
    branch: "airflow-test"
    revision: "HEAD"
    syncWait: 60

airflow:
  config:
    AIRFLOW__WEBSERVER__ENABLE_PROXY_FIX: "True"
  extraEnv:
    - name: PYTHONPATH
      value: "/opt/python/site-packages:/opt/airflow/dags/repo"
  extraPipPackages:
    - "apache-airflow[google]==2.0.1"
    - "SQLAlchemy==1.3.23"

externalDatabase:
  type: ${var.airflow_config.db_type}
  host: ${var.airflow_config.db_host}
  port: ${var.airflow_config.db_port}
  database: ${var.airflow_config.db_name}
  user: ${var.airflow_config.db_user}
  passwordSecret: "${var.airflow_config.db_password_secret}"
  passwordSecretKey: "${var.airflow_config.db_password_secret_key}"

postgresql:
  enabled: false
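
For context, these values are rendered into the release through Terraform's helm_release resource, roughly like the sketch below. The resource names and the values-file path are illustrative placeholders, not our actual configuration:

```hcl
# Illustrative sketch only: names and paths are placeholders,
# not the reporter's actual Terraform configuration.
resource "helm_release" "airflow" {
  name       = "airflow"
  repository = "https://airflow-helm.github.io/charts"
  chart      = "airflow"

  # Render the values shown above, substituting the
  # var.airflow_config fields referenced in the YAML.
  values = [
    templatefile("${path.module}/values/airflow.yaml", {
      airflow_config = var.airflow_config
    })
  ]

  wait = true
}
```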

What is your Kubernetes Version?:

Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.4", GitCommit:"d360454c9bcd1634cf4cc52d1867af5491dc9c5f", GitTreeState:"clean", BuildDate:"2020-11-12T01:09:16Z", GoVersion:"go1.15.4", Compiler:"gc", Platform:"darwin/amd64"}
Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.15-gke.7800", GitCommit:"cef3156c566a1d1a4b23ee360a760f45bfbaaac1", GitTreeState:"clean", BuildDate:"2020-12-14T09:12:37Z", GoVersion:"go1.13.15b4", Compiler:"gc", Platform:"linux/amd64"}

What is your Helm version?:

version.BuildInfo{Version:"v3.4.0", GitCommit:"7090a89efc8a18f3d8178bf47d2462450349a004", GitTreeState:"dirty", GoVersion:"go1.15.3"}
thesuperzapper commented 3 years ago

@Yinabled can you confirm if 8.0.1 fixed this issue?

Yinabled commented 3 years ago

@thesuperzapper Yes, this worked like a charm. Thanks a bunch!

inventionlabsSydney commented 3 years ago

@thesuperzapper It appears this exact error has reappeared in tag airflow-8.0.9. It is triggered when a new Docker image tag is passed. We are deploying via Terraform:

Error: cannot patch "airflow-upgrade-db" with kind Job: Job.batch "airflow-upgrade-db" is invalid: spec.template: Invalid value: core.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app":"airflow", "chart":"airflow-8.0.9", "component":"jobs", "controller-uid":"57f2d360-ba36-4a9b-bf12-8a3127d28213", "heritage":"Helm", "job-name":"airflow-upgrade-db", "release":"airflow"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:core.PodSpec{Volumes:[]core.Volume(nil), InitContainers:[]core.Container{core.Container{Name:"check-db", Image:"digitalmaas/airflow-core:6a97d81d581c19cf870b1130b3b38a3a20b1682e", Command:[]string{"/usr/bin/dumb-init", "--"}, Args:[]string{"bash", "-c", "exec timeout 60s airflow db check"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource{core.EnvFromSource{Prefix:"", ConfigMapRef:(*core.ConfigMapEnvSource)(nil), SecretRef:(*core.SecretEnvSource)(0xc009d238c0)}}, Env:[]core.EnvVar{core.EnvVar{Name:"DATABASE_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc009d23920)}, core.EnvVar{Name:"REDIS_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"PIP_NO_BINARY", Value:"multidict,yarl", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil)}, VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", 
TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(0xc005bbc660), Stdin:false, StdinOnce:false, TTY:false}}, Containers:[]core.Container{core.Container{Name:"upgrade-db", Image:"digitalmaas/airflow-core:6a97d81d581c19cf870b1130b3b38a3a20b1682e", Command:[]string{"/usr/bin/dumb-init", "--"}, Args:[]string{"bash", "-c", "exec airflow db upgrade"}, WorkingDir:"", Ports:[]core.ContainerPort(nil), EnvFrom:[]core.EnvFromSource{core.EnvFromSource{Prefix:"", ConfigMapRef:(*core.ConfigMapEnvSource)(nil), SecretRef:(*core.SecretEnvSource)(0xc009d23aa0)}}, Env:[]core.EnvVar{core.EnvVar{Name:"DATABASE_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(0xc009d23ae0)}, core.EnvVar{Name:"REDIS_PASSWORD", Value:"", ValueFrom:(*core.EnvVarSource)(nil)}, core.EnvVar{Name:"PIP_NO_BINARY", Value:"multidict,yarl", ValueFrom:(*core.EnvVarSource)(nil)}}, Resources:core.ResourceRequirements{Limits:core.ResourceList(nil), Requests:core.ResourceList(nil)}, VolumeMounts:[]core.VolumeMount(nil), VolumeDevices:[]core.VolumeDevice(nil), LivenessProbe:(*core.Probe)(nil), ReadinessProbe:(*core.Probe)(nil), StartupProbe:(*core.Probe)(nil), Lifecycle:(*core.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*core.SecurityContext)(0xc005bbc780), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]core.EphemeralContainer(nil), RestartPolicy:"OnFailure", TerminationGracePeriodSeconds:(*int64)(0xc00aa48660), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", SecurityContext:(*core.PodSecurityContext)(0xc014d13400), ImagePullSecrets:[]core.LocalObjectReference{core.LocalObjectReference{Name:"dockerhub"}}, Hostname:"", Subdomain:"", SetHostnameAsFQDN:(*bool)(nil), Affinity:(*core.Affinity)(nil), 
SchedulerName:"default-scheduler", Tolerations:[]core.Toleration(nil), HostAliases:[]core.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), PreemptionPolicy:(*core.PreemptionPolicy)(nil), DNSConfig:(*core.PodDNSConfig)(nil), ReadinessGates:[]core.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), Overhead:core.ResourceList(nil), EnableServiceLinks:(*bool)(nil), TopologySpreadConstraints:[]core.TopologySpreadConstraint(nil)}}: field is immutable
  on airflow.tf line 11, in resource "helm_release" "airflow":
  11: resource "helm_release" "airflow" {
Releasing state lock. This may take a few moments...
thesuperzapper commented 3 years ago

@inventionlabsSydney are you using helmWait=true?

Because if not, helm should successfully delete any existing airflow-upgrade-db Job before it tries to create a new one.
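
For anyone stuck on an older chart version, a manual workaround is to delete the old Job before re-applying, since a Job's pod template is immutable and can only be replaced, never patched. The namespace here is an assumption; use whatever namespace your release is in:

```shell
# Job pod templates (spec.template) are immutable, so the old Job
# must be deleted before Helm/Terraform can create a replacement.
# The "airflow" namespace is an assumption; adjust to your release.
kubectl delete job airflow-upgrade-db --namespace airflow
```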

inventionlabsSydney commented 3 years ago

Hi @thesuperzapper ,

I am using helmWait=true. Is this a problem?

thesuperzapper commented 3 years ago

@Yinabled @inventionlabsSydney this issue should no longer happen after version 8.4.0 of the chart!
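
For reference, the root cause is that a Job's spec.template is immutable, so helm upgrade cannot patch the Job in place when the image tag changes. A common chart-side fix (a general pattern, not necessarily exactly what 8.4.0 does; check the chart source for the real change) is to run the Job as a Helm hook that is deleted and recreated on every upgrade:

```yaml
# Illustrative pattern only; see the chart templates for the actual fix.
apiVersion: batch/v1
kind: Job
metadata:
  name: airflow-upgrade-db
  annotations:
    # Run the Job as part of install/upgrade...
    "helm.sh/hook": post-install,post-upgrade
    # ...and delete any previous copy before creating the new one,
    # so Helm never attempts to patch the immutable pod template.
    "helm.sh/hook-delete-policy": before-hook-creation
```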