ovh / terraform-provider-ovh

Terraform OVH provider
https://registry.terraform.io/providers/ovh/ovh/latest/docs
Mozilla Public License 2.0

Update labels on multiple kube nodepools is unpredictable #350

Closed MircoT closed 1 year ago

MircoT commented 1 year ago

Hi,

I found a problem when updating the labels of a Kubernetes node pool. It seems that if you have more than one pool, the resulting state is not predictable and only a partial change is applied.

Terraform Version

Terraform v1.3.6

Affected Resource(s)

ovh_cloud_project_kube_nodepool

Terraform Configuration Files

terraform {
  required_version = ">= 1.3.6"

  required_providers {
    openstack = {
      source  = "terraform-provider-openstack/openstack"
      version = ">= 1.49.0"
    }

    ovh = {
      source  = "ovh/ovh"
      version = ">= 0.24.0"
    }
  }
}

provider "openstack" {
  auth_url    = "https://auth.cloud.ovh.net/v3/"
  domain_name = "default"
  alias       = "ovh"
}

provider "ovh" {
  alias              = "ovh"
  endpoint           = "ovh-eu"
  application_key    = ""
  application_secret = ""
  consumer_key       = ""
}

resource "ovh_cloud_project_kube" "k8s_cluster" {
  name               = "k8s-cluster"
  region             = "GRA9"
  version            = "1.24"
}

resource "ovh_cloud_project_kube_nodepool" "k8s_pool_a" {
  kube_id       = ovh_cloud_project_kube.k8s_cluster.id
  name          = "k8s-pool-a"
  flavor_name   = "d2-4"
  desired_nodes = 1
  template {
    metadata {
      annotations = {
        "label_a" = "disabled"
        "label_b" = "enabled"
      }
      labels = {
        "label_a" = "disabled"
        "label_b" = "enabled"
      }
    }
  }
}

resource "ovh_cloud_project_kube_nodepool" "k8s_pool_b" {
  kube_id       = ovh_cloud_project_kube.k8s_cluster.id
  name          = "k8s-pool-b"
  flavor_name   = "d2-4"
  desired_nodes = 1
  template {
    metadata {
      annotations = {
        "label_a" = "enabled"
        "label_b" = "enabled"
      }
      labels = {
        "label_a" = "enabled"
        "label_b" = "enabled"
      }
    }
  }
}

Debug Output

There is no useful debug output.

Panic Output

There is no useful panic output.

Expected Behavior

If I change the labels on both pools, applying the plan should succeed and both pools should end up with the new labels.

Actual Behavior

One of the pools does not receive the new state and, consequently, its labels are not updated.

Steps to Reproduce

  1. terraform apply
  2. Change the labels for both pools
  3. terraform apply
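
As a concrete illustration of step 2 (the exact values are not important; any edit to the labels map on both pools reproduces the problem), the template block of k8s_pool_a could be changed like this, with an analogous change in k8s_pool_b:

```hcl
template {
  metadata {
    annotations = {
      "label_a" = "enabled"   # was "disabled"
      "label_b" = "enabled"
    }
    labels = {
      "label_a" = "enabled"   # was "disabled"
      "label_b" = "enabled"
    }
  }
}
```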

References

I suspect this issue is related to the type of the metadata section, which is a TypeSet, while the labels attribute inside it is a map. A similar problem when updating these types can be found here:

jon4hz commented 1 year ago

I had the same issue a few days ago, but with only one node pool. I tried to add a label; Terraform reported success, but the label was never actually added.

matprig commented 1 year ago

Hello, I am currently working on this issue. I have to double-check and run further tests to be sure that everything is OK. Best,

matprig commented 1 year ago

Hello, the new release contains the fix for this issue: https://github.com/ovh/terraform-provider-ovh/releases/tag/v0.26.0. Best,
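
For anyone picking up the fix, that means bumping the version constraint in the required_providers block of the reproduction config, for example:

```hcl
ovh = {
  source  = "ovh/ovh"
  version = ">= 0.26.0" # release containing the fix
}
```

followed by `terraform init -upgrade` so Terraform actually fetches the newer provider version rather than reusing the locked one.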

yomovh commented 1 year ago

Fixed in 0.26