claudiajs / claudia

Deploy Node.js projects to AWS Lambda and API Gateway easily
https://claudiajs.com
MIT License

updating to v5.12.0 results in a timeout error #211

Closed adrai closed 4 years ago

adrai commented 4 years ago
gojko commented 4 years ago

Hi - I upgraded the aws-sdk to the latest version for the release, but all the integration tests are passing OK on my end. Based on the error message, this might be due to some difference in how the latest SDK reads the credentials. How do you provide credentials to claudia (env variables, the credentials files, or an STS role)?

adrai commented 4 years ago

I'm using the default credential files + https://github.com/adrai/aws-auto-assume-role#without-touching-your-code (and have been for years)

used like this: AWS_PROFILE=myprofile node -r aws-auto-assume-role ./node_modules/.bin/claudia update --version prod

btw: without aws-auto-assume-role I get the same error (which was the same behaviour years ago, before I started using aws-auto-assume-role)

gojko commented 4 years ago

without being able to reproduce it I can't really fix it, but perhaps if you help me figure out where the problem is, we could do it together.

can you try just a clean node project with aws-sdk@2.599.0, no claudia, and see if you can connect to lambda (e.g. try executing lambda.listFunctions()). If that reproduces the problem, then I need to downgrade to an earlier aws-sdk version. If it doesn't, then it's somewhere in how we initialise the connection.
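
A minimal version of that test script (the region here is just illustrative) might look like:

```js
// test.js - checks whether the default credential chain can reach Lambda
const AWS = require('aws-sdk');

const lambda = new AWS.Lambda({ region: 'us-east-1' }); // region is illustrative

lambda.listFunctions({}, (err, data) => {
  if (err) {
    console.error('could not connect:', err.message);
  } else {
    console.log('ok, found', data.Functions.length, 'functions');
  }
});
```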

gojko commented 4 years ago

btw, do you have any of these environment variables set: AWS_MFA_SERIAL or AWS_ROLE_ARN? If so, claudia might be trying to re-initialise the role after your script assumes it. You can now just provide the role ARN with --sts-role-arn directly to claudia.
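
(Illustratively: claudia update --sts-role-arn arn:aws:iam::123456789012:role/deploy --version prod, where the ARN is a placeholder for your own role.)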

adrai commented 4 years ago

this morning I tried a whole range of aws-sdk versions... it's always the same; the only thing that triggers this error is the update to v5.12.0... AWS_MFA_SERIAL and AWS_ROLE_ARN are not set...

btw: as soon as I'm back at my machine... I will try some stuff...

adrai commented 4 years ago

> without being able to reproduce it I can't really fix it, but perhaps if you help me figure out where the problem is, we could do it together.
>
> can you try just a clean node project with aws-sdk@2.599.0, no claudia, and see if you can connect to lambda (e.g. try executing lambda.listFunctions()). If that reproduces the problem, then I need to downgrade to an earlier aws-sdk version. If it doesn't, then it's somewhere in how we initialise the connection.

executing the lambda.listFunctions script like this: AWS_PROFILE=myprofile node -r aws-auto-assume-role test.js works

executing the lambda.listFunctions script like this: AWS_PROFILE=myprofile node test.js behaves exactly like claudia... it hangs for 2 minutes and then errors with:

```
Error: connect ETIMEDOUT 169.254.169.254:80
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1129:14) {
  message: 'Missing credentials in config',
  errno: 'ETIMEDOUT',
  code: 'CredentialsError',
  syscall: 'connect',
  address: '169.254.169.254',
  port: 80,
  time: 2020-01-20T20:19:31.961Z,
  originalError: {
    message: 'Could not load credentials from any providers',
    errno: 'ETIMEDOUT',
    code: 'CredentialsError',
    syscall: 'connect',
    address: '169.254.169.254',
    port: 80,
    time: 2020-01-20T20:19:31.961Z,
    originalError: {
      message: 'EC2 Metadata roleName request returned error',
      errno: 'ETIMEDOUT',
      code: 'ETIMEDOUT',
      syscall: 'connect',
      address: '169.254.169.254',
      port: 80,
      time: 2020-01-20T20:19:31.961Z,
      originalError: [Object]
    }
  }
}
```
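
(For context: 169.254.169.254 is the EC2 instance-metadata endpoint. When the SDK finds no usable credentials in environment variables or the shared credentials file, it falls back to asking instance metadata for a role, and on a machine that isn't an EC2 instance that request simply times out.)
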
adrai commented 4 years ago

interesting finding: in my app's package.json:

"claudia": "5.12.0"
"aws-sdk": "2.607.0"

clean install: rm -rf node_modules && rm -f package-lock.json && npm i => same result as described at the beginning... it does not work. If I now manually remove the aws-sdk from node_modules/claudia/node_modules, i.e. rm -rf node_modules/claudia/node_modules/aws-sdk => it works

the reason: for aws-auto-assume-role to work, the aws-sdk it requires must be the very same module instance that claudia loads: https://github.com/adrai/aws-auto-assume-role/blob/master/index.js#L42
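
A small sketch of the effect, with hypothetical paths (Node caches modules per resolved file path, so the nested copy is a separate instance):

```js
// Node resolves and caches modules by absolute file path, so the app's copy
// and claudia's nested copy of aws-sdk are two independent module instances.
const appSdk = require('aws-sdk');                          // ./node_modules/aws-sdk
const nestedSdk = require('claudia/node_modules/aws-sdk');  // ./node_modules/claudia/node_modules/aws-sdk

console.log(appSdk === nestedSdk); // false: patching one instance does not affect the other
```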

Could making the aws-sdk dependency in claudia a peerDependency be a solution?

adrai commented 4 years ago

as an alternative I could do this: [screenshot]

gojko commented 4 years ago

since you're using a more recent version, I guess the best course of action is to upgrade claudia to a more recent aws-sdk

adrai commented 4 years ago

if this is ok for you, it is also ok for me 😉

gojko commented 4 years ago

I've upgraded aws-sdk to the latest version and pushed a build to the CI server; if all the tests pass I'll publish to npm tomorrow (it's already fairly late here). Meanwhile, you should be able to install claudia directly from git instead of npm; perhaps give it a try and see if that fixes the problem.
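
(npm can install straight from GitHub, e.g. npm install claudiajs/claudia, which pulls the repo's default branch; that is one way to try the unreleased build.)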

adrai commented 4 years ago

no rush... it's late here too 😉

adrai commented 4 years ago

fyi: the version installed directly from the repo's master also gets its own nested aws-sdk module... so it's still not working

adrai commented 4 years ago

when removing the shrinkwrap file, it works: https://github.com/adrai/claudia/commit/47c8e13f682acead363a14eabc22ed22d6956750
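
(This matches how npm treats a published npm-shrinkwrap.json: it is honoured when the package is installed as a dependency, so claudia's pinned aws-sdk appears to get installed nested under node_modules/claudia/node_modules no matter what the app itself declares.)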

gojko commented 4 years ago

removing the shrinkwrap is not a good idea, because then we can't guarantee that the other dependencies will be ok after installation; does your role work if you set it using --sts-role-arn instead of magically requiring the additional module?

adrai commented 4 years ago

yes, using --sts-role-arn works, but then if individual users working on the same project have different roles, they all need to adjust it... (so the claudia commands can't be shared anymore)... don't worry... I'll work around it...

adrai commented 4 years ago

If you want, we can close this issue

adrai commented 4 years ago

Just a little question: Why is the npm-shrinkwrap.json file included in the 5.12.0 package but not in the 5.11.0 package? https://registry.npmjs.org/claudia/-/claudia-5.11.0.tgz https://registry.npmjs.org/claudia/-/claudia-5.12.0.tgz

gojko commented 4 years ago

I want to solve this, just making sure you have a workaround while we're fixing it :)

claudia will also take the STS role ARN from an environment variable (AWS_ROLE_ARN), so it doesn't have to be in the command - allowing team members to still share the commands.
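
(Hypothetically: AWS_ROLE_ARN=arn:aws:iam::123456789012:role/deploy claudia update --version prod, with the ARN as a placeholder; each team member can export their own value.)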

so the assumption that both claudia and the role-assuming script need the same aws-sdk is wrong? Both the repo on git and your code now use the same version (2.607.0).

Does npm dedupe help?

the shrinkwrap should have been included in 5.11 as well; if it was not, that was probably due to an npm publish bug when I published it.

adrai commented 4 years ago

> so the assumption that both claudia and the role-assuming script need the same aws-sdk is wrong? Both the repo on git and your code now use the same version (2.607.0).

They must be not only the same version but the same "object": require('aws-sdk') in the app's code and in claudia should return the same aws-sdk "object".

> I want to solve this, just making sure you have a workaround while we're fixing it :)

But really, nothing to worry about... it seems I'm the only one using aws-auto-assume-role ;-)

gojko commented 4 years ago

> They must be not only the same version but the same "object": require('aws-sdk') in the app's code and in claudia should return the same aws-sdk "object".

my understanding of how node.js works is that this should work out of the box: two requires of the same module should return the same object, as long as they are in the same memory space and use the same file system path.
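
A quick sketch of that behaviour:

```js
// require() caches by the resolved absolute filename, so two requires of the
// same specifier from the same location return the exact same exports object.
const first = require('aws-sdk');
const second = require('aws-sdk');

console.log(first === second);                                             // true
console.log(require.cache[require.resolve('aws-sdk')].exports === first);  // true
```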

can you try npm dedupe after npm install; this should move claudia's sub-dependency one level up if both are on the same version.

adrai commented 4 years ago

yes, npm dedupe fixes it

gojko commented 4 years ago

ok, then I guess it's some npm peculiarity that this isn't deduplicated out of the box, but there is a decent workaround now?

adrai commented 4 years ago

for me, yes... I will use AWS_ROLE_ARN

adrai commented 4 years ago

at least we learned something ;-)