gavinbunney / terraform-provider-kubectl

Terraform provider to handle raw kubernetes manifest yaml files
https://registry.terraform.io/providers/gavinbunney/kubectl
Mozilla Public License 2.0

failed to create kubernetes rest client for update of resource: Unauthorized #159

Closed brandonros closed 2 years ago

brandonros commented 2 years ago
2022-03-15T17:48:21.908-0400 [DEBUG] provider.terraform-provider-kubectl_v1.13.1.exe: 2022/03/15 17:48:21 [ERROR] creating manifest failed: nodejs-starter-deployment failed to create kubernetes rest client for update of resource: Unauthorized
2022-03-15T17:48:21.936-0400 [ERROR] vertex "module.np-cbs-eastus2-aks.kubectl_manifest.nodejs-starter-deployment" error: nodejs-starter-deployment failed to create kubernetes rest client for update of resource: Unauthorized

I obviously have a permissions issue somewhere, but even with TF_LOG=debug I can't figure out what or where.

provider "kubectl" {
  host                   = data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.host
  username               = data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.username
  password               = data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.password
  cluster_ca_certificate = base64decode(data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.cluster_ca_certificate)
  client_certificate     = base64decode(data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.client_certificate)
  client_key             = base64decode(data.azurerm_kubernetes_cluster.np-cbs-REDACTED-aks.kube_config.0.client_key)
  load_config_file       = false
}
brandonros commented 2 years ago

I switched load_config_file to true and removed the username/password and certificate/key options, and it seems to be OK. This was after making sure the AKS-style .kube/config worked locally after running az aks get-credentials. It would be great if there were more logging to show where, what, and why this was failing.
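
For reference, a minimal sketch of what that working provider block could look like, assuming the context written by az aks get-credentials is the one the provider should pick up (the config_path and config_context values here are illustrative, not taken from the comment):

provider "kubectl" {
  # Read credentials from the local kubeconfig instead of passing
  # username/password and client certificates explicitly.
  load_config_file = true

  # Illustrative values only; adjust to the kubeconfig and context
  # produced by `az aks get-credentials`.
  config_path    = "~/.kube/config"
  config_context = "np-cbs-REDACTED-aks"
}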

wjayesh commented 2 years ago

I got the same error when creating the EKS cluster and this kubectl_manifest resource as part of the same terraform apply.

The only fix I currently have is to configure my local .kube/config and then run apply again, similar to what you did. It would be great if this could work as part of a single flow.
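
One pattern that can keep this in a single terraform apply (a sketch under assumptions: the aws CLI is available wherever Terraform runs, the data source name aws_eks_cluster.this is purely illustrative, and the kubectl provider's exec block is configured like the kubernetes provider's) is to have the provider fetch a token at apply time instead of relying on a pre-configured local kubeconfig:

provider "kubectl" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)
  load_config_file       = false

  # Ask the aws CLI for a fresh token when the provider builds its client,
  # rather than reading credentials generated earlier in the run.
  exec {
    api_version = "client.authentication.k8s.io/v1beta1"
    command     = "aws"
    args        = ["eks", "get-token", "--cluster-name", data.aws_eks_cluster.this.name]
  }
}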

iomarcovalente commented 2 years ago

We are having the same issue. After running terraform apply and rolling-updating one node in the cluster, Terraform moved on to applying some YAML manifests and the apply crashed:

module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 10s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 20s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 30s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 40s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 50s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m0s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m10s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m20s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m30s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m40s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 1m50s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 2m0s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 2m10s elapsed]
module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0]: Still modifying... [id=/apis/karpenter.k8s.aws/v1alpha1/awsnodetemplates/default, 2m20s elapsed]
╷
│ Error: default failed to create kubernetes rest client for update of resource: Unauthorized
│ 
│   with module.eks_cluster_1.kubectl_manifest.karpenter_node_template[0],
│   on .terraform/modules/eks_cluster_1/modules/aws/eks/addon_karpenter.tf line 33, in resource "kubectl_manifest" "karpenter_node_template":
│   33: resource "kubectl_manifest" "karpenter_node_template" {
│ 
╵
Releasing state lock. This may take a few moments...

Subsequent apply shows no changes
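
One possible explanation for the timing above (an assumption, not something stated in the thread): if the provider was configured with a static token from data.aws_eks_cluster_auth, that token is generated once when the data source is read and only stays valid for a short window, so an apply that spends minutes on other resources can hit the API with an expired token. A sketch of that fragile wiring, with the data source names purely illustrative:

provider "kubectl" {
  host                   = data.aws_eks_cluster.this.endpoint
  cluster_ca_certificate = base64decode(data.aws_eks_cluster.this.certificate_authority[0].data)

  # This token is minted when the data source is read and is short-lived,
  # so requests made late in a long apply can come back Unauthorized.
  token            = data.aws_eks_cluster_auth.this.token
  load_config_file = false
}

Exec-based credentials, as sketched above, avoid this because a token is requested whenever the provider needs to talk to the cluster.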