
Kubeconfig is missing AWS profile environment variable #1426

Open
mjmottram opened this issue Oct 7, 2024 · 2 comments
Labels
awaiting-feedback Blocked on input from the author kind/bug Some behavior is incorrect or out of spec

Comments


mjmottram commented Oct 7, 2024

What happened?

Possibly related to #1038, running with

{
  "name": "...",
  "dependencies": {
    "@pulumi/aws": "^6.8.0",
    "@pulumi/awsx": "^2.4.0",
    "@pulumi/eks": "^2.1.0",
    "@pulumi/kubernetes": "^4.6.1",
    "@pulumi/pulumi": "^3.104.0",
    "@pulumi/tls": "^4.11.1",
    "typescript": "^5.2.2"
  }
}

and a deployment containing

const cluster = new eks.Cluster(
...
);

export const kubeconfig = cluster.kubeconfig.apply(JSON.stringify);

the generated kubeconfig contains the AWS profile that was used when running pulumi up (taken from the aws:profile parameter in our Pulumi stack config).

Running with

{
  "name": "...",
  "dependencies": {
    "@pulumi/aws": "^6.50.1",
    "@pulumi/awsx": "^2.14.0",
    "@pulumi/eks": "^2.7.9",
    "@pulumi/kubernetes": "^4.17.1",
    "@pulumi/pulumi": "^3.130.0",
    "@pulumi/tls": "^4.11.1",
    "typescript": "^5.2.2"
  }
}
the AWS profile is missing from the generated kubeconfig's environment. Since our setup includes a main AWS account, where we self-host our Pulumi state on S3, plus separate AWS accounts for staging and production, subsequent deployment pipelines attempt to access EKS using the default (i.e. main) AWS account.
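For reference, a sketch of the kubeconfig fragment where earlier versions of @pulumi/eks carried the profile, in the exec credential plugin's environment (cluster name, profile value, and apiVersion here are illustrative, not taken from our actual output):

```yaml
# Illustrative fragment of a generated kubeconfig
users:
- name: aws
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1beta1
      command: aws
      args: ["eks", "get-token", "--cluster-name", "my-cluster"]
      env:
      - name: AWS_PROFILE   # present before the upgrade, missing after it
        value: staging
```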

Example

See description

Output of pulumi about

See description

Additional context

No response

Contributing

Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

@mjmottram mjmottram added kind/bug Some behavior is incorrect or out of spec needs-triage Needs attention from the triage team labels Oct 7, 2024
mjmottram (Author) commented

I've just seen the change that caused this. It's not clear to me whether it's a bug that Pulumi is no longer picking up the expected profile from aws:profile in our stack config. The comments in the linked change suggest we should set the AWS_PROFILE env var, but that would interfere with our self-hosted Pulumi state on S3 in another account.

corymhall (Contributor) commented

@mjmottram I think you should be able to use the getKubeConfig({ profileName: 'my-profile' }) method, which lets you specify which profile to use.

Does this work for your use case?
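A minimal sketch of that suggestion, assuming the getKubeconfig method on @pulumi/eks Cluster and an illustrative profile name of "staging" (the method returns the serialized kubeconfig as an output, so no extra JSON.stringify should be needed):

```typescript
import * as eks from "@pulumi/eks";

const cluster = new eks.Cluster("my-cluster");

// Pin the AWS profile used by the kubeconfig's exec credential plugin,
// independently of whatever credentials the Pulumi backend itself uses.
export const kubeconfig = cluster.getKubeconfig({ profileName: "staging" });
```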

@corymhall corymhall added awaiting-feedback Blocked on input from the author and removed needs-triage Needs attention from the triage team labels Oct 8, 2024