Closed jiaqiluo closed 2 days ago
This issue can be validated on v1.7.0-rc1
Validated that this is addressed. See details below:
ENVIRONMENT DETAILS

- TFP-RKE: v1.7.0-rc1
- Kubernetes: v1.31.2
TEST RESULT

| # | Scenario | Result |
|---|---|---|
| 1 | Provision RKE1 cluster using TFP-RKE v1.7.0-rc1 | :white_check_mark: |
VALIDATION STEPS

Scenario 1

1. Create a `main.tf`:
```hcl
terraform {
  required_providers {
    rke = {
      source  = "terraform.local/local/rke"
      version = "1.7.0-rc1"
    }
  }
}

provider "rke" {
  log_file = var.log_file
}

resource "rke_cluster" "cluster" {
  nodes {
    address = var.address1
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  nodes {
    address = var.address2
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  nodes {
    address = var.address3
    user    = var.user
    role    = ["controlplane", "worker", "etcd"]
    ssh_key = file(var.ssh_key_path)
  }

  enable_cri_dockerd = true
}
```
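The configuration above references several input variables (`log_file`, `address1`–`address3`, `user`, `ssh_key_path`). A minimal `variables.tf` sketch for these references could look like the following; the descriptions and types are assumptions, and real values must be supplied by the operator:

```hcl
# Hypothetical variables.tf — names match the var.* references in main.tf;
# values here are illustrative only.
variable "log_file" {
  type        = string
  description = "Path for the RKE provider debug log"
}

variable "address1" {
  type        = string
  description = "IP or hostname of the first node"
}

variable "address2" {
  type        = string
  description = "IP or hostname of the second node"
}

variable "address3" {
  type        = string
  description = "IP or hostname of the third node"
}

variable "user" {
  type        = string
  description = "SSH user with Docker access on the nodes"
}

variable "ssh_key_path" {
  type        = string
  description = "Path to the private SSH key used to reach the nodes"
}
```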
_NOTE:_ The kubeconfig is in `outputs.tf` and will not be shared here.
2. Run `terraform apply`.
3. Copy the kubeconfig defined in `outputs.tf` to `~/.kube/config`.
4. Run `kubectl get nodes`:

```
$ kubectl get nodes
NAME   STATUS   ROLES   AGE   VERSION
```
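Steps 2–4 can be sketched as a single shell session. Writing the kubeconfig via `terraform output` assumes `outputs.tf` exposes an output named `kube_config` (a hypothetical name, since `outputs.tf` is not shared here) that maps to the cluster's kubeconfig attribute:

```shell
# Provision the cluster from main.tf
terraform apply

# Write the kubeconfig to kubectl's default location
# ("kube_config" is an assumed output name; adjust to match outputs.tf)
terraform output -raw kube_config > ~/.kube/config

# Verify all three nodes register and report Ready
kubectl get nodes
```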
This issue is used to track the effort for supporting RKE v1.7.0: https://github.com/rancher/rke/releases/tag/v1.7.0