lensapp / lens

Lens - The way the world runs Kubernetes
https://k8slens.dev/
MIT License

Error while connecting to cluster #7428

Open tautvydasbujauskas opened 1 year ago

tautvydasbujauskas commented 1 year ago

Describe the bug: I noticed that on some of my clusters I get this error:

E0329 14:53:13.515135 12560 proxy_server.go:147] Error while proxying request: getting credentials: decoding stdout: couldn't get version/kind; json parse error: json: cannot unmarshal string into Go value of type struct { APIVersion string "json:\"apiVersion,omitempty\""; Kind string "json:\"kind,omitempty\"" }

I can access the cluster with the same kubeconfig via kubectl.


Expected behavior: Lens should connect to the cluster, but it doesn't.



Logs (from running the application executable on the command line):

E0329 14:53:13.515135 12560 proxy_server.go:147] Error while proxying request: getting credentials: decoding stdout: couldn't get version/kind; json parse error: json: cannot unmarshal string into Go value of type struct { APIVersion string "json:\"apiVersion,omitempty\""; Kind string "json:\"kind,omitempty\"" }

Kubeconfig (sanitized):

apiVersion: v1
clusters:
- cluster:
    certificate-authority-data: certinBase64
    server: https://aws-api.com
  name: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
contexts:
- context:
    cluster: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
    user: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
  name: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
current-context: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
kind: Config
preferences: {}
users:
- name: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      args:
      - --region
      - us-east-1
      - eks
      - get-token
      - --cluster-name
      - prod-cluster
      command: aws
      env:
      - name: AWS_PROFILE
        value: XXXXXXXXXX


stefanodefaria commented 1 month ago

I had the same issue. To fix it, I generated a fresh kubeconfig with the aws eks update-kubeconfig command, then ran diff to compare the old and new kubeconfig files. Here's the output:

27,28d26
<       - --output
<       - json
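So the only difference the diff surfaced is an explicit `--output json` on the `aws eks get-token` exec command. That matches the original error: if the AWS CLI's configured default output format is text or yaml, the token comes back as non-JSON and Lens fails to unmarshal it. A sketch of the working exec section, assuming the kubeconfig shown earlier in this issue (cluster ARN, region, and `AWS_PROFILE` value are placeholders from that file):

```yaml
users:
- name: arn:aws:eks:XXXXXXXXXXXXX:cluster/prod-cluster
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: aws
      args:
      - --region
      - us-east-1
      - eks
      - get-token
      - --cluster-name
      - prod-cluster
      # Pin the output format so the token is JSON even when the
      # AWS CLI profile defaults to text/yaml output:
      - --output
      - json
      env:
      - name: AWS_PROFILE
        value: XXXXXXXXXX
```

Regenerating the file with `aws eks update-kubeconfig` (recent AWS CLI versions) should produce an equivalent exec block.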