rancher / terraform-provider-rke

Terraform provider plugin for deploying Kubernetes clusters with RKE (Rancher Kubernetes Engine)
Mozilla Public License 2.0

[1.7] RKE v1.7.0 support #485

Closed jiaqiluo closed 2 days ago

jiaqiluo commented 2 days ago

This issue is used to track the effort for supporting RKE v1.7.0 https://github.com/rancher/rke/releases/tag/v1.7.0

jiaqiluo commented 2 days ago

This issue can be validated on v1.7.0-rc1

markusewalker commented 2 days ago

Validated that this is addressed. See details below:

ENVIRONMENT DETAILS

TEST RESULT

| # | Scenario | Result |
| --- | --- | --- |
| 1 | Provision RKE1 cluster using TFP-RKE v1.7.0-rc1 | :white_check_mark: |

VALIDATION STEPS

Scenario 1

  1. On the client node, utilize the following main.tf:
    
```
terraform {
  required_providers {
    rke = {
      source  = "terraform.local/local/rke"
      version = "1.7.0-rc1"
    }
  }
}

provider "rke" {
  log_file = var.log_file
}

resource "rke_cluster" "cluster" {
  nodes {
    address = var.address1
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  nodes {
    address = var.address2
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  nodes {
    address = var.address3
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  enable_cri_dockerd = true
}
```

_NOTE:_ The kubeconfig is defined in `outputs.tf` and will not be shared here; a sketch of a possible `outputs.tf` is included after these steps.
2. Run `terraform apply`.
3. Place the kubeconfig defined in `outputs.tf` into `~/.kube/config`.
4. Run `kubectl get nodes`:

```
$ kubectl get nodes
NAME   STATUS   ROLES                      AGE   VERSION
       Ready    controlplane,etcd,worker   83s   v1.31.2
       Ready    controlplane,etcd,worker   83s   v1.31.2
       Ready    controlplane,etcd,worker   83s   v1.31.2
```
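
For reference, here is a minimal sketch of what the `outputs.tf` mentioned in step 1 could look like. This is an assumption, not the actual file used in the validation; it presumes the `rke_cluster` resource is named `cluster` as in the `main.tf` above and that the generated kubeconfig is exposed through the resource's `kube_config_yaml` attribute.

```
# Hypothetical outputs.tf sketch, not the one used in this validation.
# Assumes the rke_cluster resource "cluster" from main.tf above and the
# kube_config_yaml attribute exported by the rke_cluster resource.
output "kube_config" {
  value     = rke_cluster.cluster.kube_config_yaml
  sensitive = true
}
```

With an output like this, `terraform output -raw kube_config` can be redirected into `~/.kube/config` for steps 3 and 4.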