SectorLabs / celery-cloudwatch

Uploads results of Celery tasks to AWS CloudWatch.
MIT License

Fall back to IAM policy #1

Open lllama opened 7 years ago

lllama commented 7 years ago

Boto on an EC2 instance can grab its credentials from AWS directly (via the instance's IAM role), rather than from environment variables.

Would it be possible to allow for this? I've monkey-patched my local version to just ignore the env vars, but it's not a great solution.
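A minimal sketch of the fallback being requested, using the `AWS_CLOUDWATCH_ACCESS_KEY`/`AWS_CLOUDWATCH_SECRET_KEY` setting names mentioned later in this thread (the helper name itself is hypothetical, not part of the library):

```python
import os

def aws_credential_kwargs():
    """Use explicit keys only when both env vars are set; otherwise
    return empty kwargs so boto falls back to its own credential chain
    (environment, ~/.aws/credentials, the instance's IAM role)."""
    key = os.environ.get("AWS_CLOUDWATCH_ACCESS_KEY")
    secret = os.environ.get("AWS_CLOUDWATCH_SECRET_KEY")
    if key and secret:
        return {"aws_access_key_id": key, "aws_secret_access_key": secret}
    return {}  # empty kwargs -> boto resolves credentials itself
```

The returned dict can be splatted into the boto client constructor, so an EC2 instance with an IAM role needs no env vars at all.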

Photonios commented 7 years ago

I've just pushed a commit that makes this possible :)

https://github.com/SectorLabs/celery-cloudwatch/commit/b3badb5a4a211ce6c2096e7117f44a2f09fcaa05

Photonios commented 7 years ago

Please let me know if this works for you. If so, I'll publish a new version on PyPI.

lllama commented 7 years ago

I think that setting the access key and secret to None in the AWS config dict will stop boto from authenticating (it tries to auth with None and None).

I'll try the commit to confirm though. Will try to get to it asap.

Photonios commented 7 years ago

That's pretty much what happens now. If you don't supply the environment variables, it will use None, which causes boto to fall back to other methods, such as reading the ~/.aws/credentials file or the instance's IAM role.

lllama commented 7 years ago

So this seems to work, though I get errors from the describe_regions call; that's because my IAM profile doesn't grant that permission.

dexterous commented 6 years ago

Has this been pushed out to PyPI?

dexterous commented 6 years ago

Simply removing the check for the mandatory presence of AWS_CLOUDWATCH_ACCESS_KEY and AWS_CLOUDWATCH_SECRET_KEY should do the job; the dummy call to ec2.describe_regions() is unnecessary. Alternatively, it could be replaced with a call to logs.describe_log_groups().