pulumi / pulumi-eks

A Pulumi component for easily creating and managing an Amazon EKS Cluster
https://www.pulumi.com/registry/packages/eks/
Apache License 2.0

Kubeconfig is missing AWS profile environment variable #1426

Open · mjmottram opened this issue 1 month ago

mjmottram commented 1 month ago

What happened?

Possibly related to https://github.com/pulumi/pulumi-eks/issues/1038. Running with

{
  "name": "...",
  "dependencies": {
    "@pulumi/aws": "^6.8.0",
    "@pulumi/awsx": "^2.4.0",
    "@pulumi/eks": "^2.1.0",
    "@pulumi/kubernetes": "^4.6.1",
    "@pulumi/pulumi": "^3.104.0",
    "@pulumi/tls": "^4.11.1",
    "typescript": "^5.2.2"
  }
}

and a deployment containing

const cluster = new eks.Cluster(
...
);

export const kubeconfig = cluster.kubeconfig.apply(JSON.stringify);

With those versions, the exported kubeconfig contains the AWS profile that was used when running pulumi up, taken from the aws:profile parameter in our Pulumi stack config.
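For reference, a quick way to check whether the profile made it into the exported kubeconfig (a sketch only; the users[0].user.exec.env path assumes the standard exec-based kubeconfig layout that pulumi-eks emits):

cluster.kubeconfig.apply(kc => {
  // Sketch: log the exec plugin's env entries; with the profile wired through,
  // this should include an entry like { name: "AWS_PROFILE", value: "<your profile>" }.
  const exec = kc?.users?.[0]?.user?.exec;
  console.log("kubeconfig exec env:", JSON.stringify(exec?.env ?? []));
  return kc;
});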

Running with

{
  "name": "...",
  "dependencies": {
    "@pulumi/aws": "^6.50.1",
    "@pulumi/awsx": "^2.14.0",
    "@pulumi/eks": "^2.7.9",
    "@pulumi/kubernetes": "^4.17.1",
    "@pulumi/pulumi": "^3.130.0",
    "@pulumi/tls": "^4.11.1",
    "typescript": "^5.2.2"
  }
}

the AWS profile environment variable is missing from the generated kubeconfig. Since our setup uses a main AWS account where we self-host our Pulumi state on S3, plus separate AWS accounts for staging and production, this causes our subsequent deployment pipelines to attempt to access EKS using the default (i.e. main) AWS account.
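As an illustration of where this bites, here is a hypothetical downstream stack that consumes the exported kubeconfig (the stack reference name is a placeholder; the output name matches the kubeconfig export above):

import * as pulumi from "@pulumi/pulumi";
import * as k8s from "@pulumi/kubernetes";

// Hypothetical downstream deployment stack: it consumes the kubeconfig exported above.
const clusterStack = new pulumi.StackReference("myorg/eks-cluster/staging"); // placeholder name
const kubeconfig = clusterStack.requireOutput("kubeconfig");

// The kubeconfig's exec plugin runs `aws eks get-token`; without an AWS_PROFILE env
// entry it resolves credentials from the default chain, i.e. the main account here.
const k8sProvider = new k8s.Provider("eks", { kubeconfig });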

Example

See description

Output of pulumi about

See description

Additional context

No response

Contributing

Vote on this issue by adding a 👍 reaction. To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

mjmottram commented 1 month ago

I've just seen the change that caused this. It's not clear to me whether it's a bug that Pulumi is not picking up the expected profile from aws:profile in our stack config. The comments in the linked change suggest we should set our AWS_PROFILE env var, but this would interfere with the self-hosted Pulumi state on S3 in another account.

corymhall commented 1 month ago

@mjmottram I think you should be able to use the getKubeconfig({ profileName: 'my-profile' }) method, which lets you specify the profile to use.
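A minimal sketch of how that might look in the program above ('my-profile' is a placeholder, and depending on the SDK version the returned kubeconfig may already be a serialized string, in which case the JSON.stringify apply from the original export isn't needed):

// Sketch: generate a kubeconfig that pins the AWS profile used by the
// `aws eks get-token` exec plugin. 'my-profile' is a placeholder profile name.
export const profileKubeconfig = cluster.getKubeconfig({ profileName: "my-profile" });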

Does this work for your use case?