pulumi / pulumi-policy-aws

A policy pack of rules to enforce AWS best practices for security, reliability, cost, and more!
https://www.pulumi.com
Apache License 2.0

Error validating resource with policy mfa-enabled-for-iam-console-access #91

Open jkodroff opened 1 year ago

jkodroff commented 1 year ago

What happened?

When consuming AWSGuard, my Pulumi program fails with the following error:

Diagnostics:
  aws:iam:UserLoginProfile (no-mfa):
    error: Error validating resource with policy mfa-enabled-for-iam-console-access:
    Error: connect EHOSTDOWN 169.254.169.254:80 - Local (192.168.1.127:59777)
        at internalConnect (node:net:1053:16)
        at defaultTriggerAsyncIdScope (node:internal/async_hooks:464:18)
        at node:net:1156:9
        at processTicksAndRejections (node:internal/process/task_queues:77:11)
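The `EHOSTDOWN` on 169.254.169.254 means the AWS SDK inside the policy host fell back to the EC2 instance metadata service (IMDS) because it could not resolve credentials any other way. As a possible workaround sketch (not a confirmed fix for this issue), the SDK's real `AWS_EC2_METADATA_DISABLED` environment variable suppresses that fallback, so credential resolution fails fast instead of dialing a link-local address:

```shell
# Possible workaround sketch: AWS_EC2_METADATA_DISABLED is a real AWS SDK
# environment variable that stops the SDK from contacting the EC2 instance
# metadata service (169.254.169.254) when no other credentials are found.
export AWS_EC2_METADATA_DISABLED=true
echo "$AWS_EC2_METADATA_DISABLED"
```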

Steps to reproduce

Pulumi program:

"""An AWS Python Pulumi program"""

import pulumi_aws as aws

my_user = aws.iam.User(
    "my-user",
    aws.iam.UserArgs(
        force_destroy=True,
    )
)

aws.iam.UserLoginProfile(
    "no-mfa",
    aws.iam.UserLoginProfileArgs(
        user=my_user.name,
    ),
)

Policy pack:

import { AwsGuard } from "@pulumi/awsguard";

new AwsGuard({
  all: "advisory",
});

Expected Behavior

No error.

Actual Behavior

Error.

Output of pulumi about

CLI          
Version      3.46.1
Go Version   go1.19.3
Go Compiler  gc

Plugins
NAME    VERSION
aws     5.20.0
python  unknown

Host     
OS       darwin
Version  12.6
Arch     arm64

This project is written in python: executable='/Users/jkodroff/tmp/workshop-policy-as-code/infra/venv/bin/python3' version='3.10.8'

Current Stack: dev

TYPE                  URN
pulumi:pulumi:Stack   urn:pulumi:dev::workshop-policy-as-code::pulumi:pulumi:Stack::workshop-policy-as-code-dev
pulumi:providers:aws  urn:pulumi:dev::workshop-policy-as-code::pulumi:providers:aws::default_5_20_0
aws:s3/bucket:Bucket  urn:pulumi:dev::workshop-policy-as-code::aws:s3/bucket:Bucket::log-bucket
aws:s3/bucket:Bucket  urn:pulumi:dev::workshop-policy-as-code::aws:s3/bucket:Bucket::bucket-without-logging
aws:s3/bucket:Bucket  urn:pulumi:dev::workshop-policy-as-code::aws:s3/bucket:Bucket::bucket-with-logging

Found no pending operations associated with dev

Backend        
Name           pulumi.com
URL            https://app.pulumi.com/jkodroff
User           jkodroff
Organizations  jkodroff, jkodrofftest, demo, pulumi

Dependencies:
NAME        VERSION
pip         22.3.1
pulumi-aws  5.20.0
setuptools  65.5.1
wheel       0.38.3

Pulumi locates its logs in /var/folders/5m/4n1x3f8151s35wc80w06z5k80000gn/T/ by default

Additional context

No response


aq17 commented 1 year ago

This seems to be a Node.js issue where worker threads don't receive the proper AWS config / environment variables: https://github.com/dherault/serverless-offline/issues/1430

jkodroff commented 1 year ago

@aq17 I'm not deliberately setting that flag. Is this something that's part of this library, or pulumi-policy (the Node implementation) as a whole?

aq17 commented 1 year ago

I think the latter – I'd like to check with someone more familiar with the node implementation to be sure though

rshade commented 1 year ago

It seems to be trying to reach the AWS metadata service (169.254.169.254:80) for some reason. It doesn't do this on my workstation, @jkodroff.

markanye commented 1 year ago

I don't know if this will help you, but I ran into a similar problem while using websockets in serverless offline mode. When attempting to send a message back to a connected client, my serverless node process would quit with the following error:

Environment: darwin, node 18.12.1, framework 3.25.1 (local), plugin 6.2.2, SDK 4.3.2
Credentials: Serverless Dashboard, "skylight-aws" provider (https://app.serverless.com/marknye/apps/skylightng/skylightng/dev/us-east-1/providers)
Docs:        docs.serverless.com
Support:     forum.serverless.com
Bugs:        github.com/serverless/serverless/issues

Error:
Error: connect EHOSTUNREACH 169.254.169.254:80 - Local (192.168.10.8:57282)
    at internalConnect (node:net:1053:16)
    at defaultTriggerAsyncIdScope (node:internal/async_hooks:464:18)
    at node:net:1156:9
    at process.processTicksAndRejections (node:internal/process/task_queues:77:11)

The problem ended up being a missing ~/.aws directory. I restored the directory with the config/credentials for my AWS development account, and the error went away. Of course, I don't understand why serverless offline needs creds to locally emulate AWS websocket support, but that's a question for another day.
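The missing-~/.aws failure above can be turned into a quick preflight check before launching a tool that would otherwise dial the metadata service. A minimal shell sketch (the function name and temp-HOME setup are illustrative, not from this thread):

```shell
# Illustrative preflight sketch: report whether an AWS config directory
# exists under a given HOME, mirroring the missing-~/.aws failure above.
# A temp HOME is used so the demonstration is deterministic.
check_aws_config() {
  if [ -e "$1/.aws/config" ] || [ -e "$1/.aws/credentials" ]; then
    echo "present"
  else
    echo "missing - SDK may fall back to the metadata service"
  fi
}

tmp_home="$(mktemp -d)"
check_aws_config "$tmp_home"                     # no ~/.aws yet
mkdir -p "$tmp_home/.aws"
touch "$tmp_home/.aws/credentials"
check_aws_config "$tmp_home"                     # now present
```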