This issue has one additional caveat: I searched diligently and found no solution.
My actions before raising this issue
When I run this Python script directly on my EC2 instance, it works with no issues. When I run it in OpenFaaS on Docker Swarm on the same EC2 instance, it cannot find AWS credentials. I don't have AWS credentials other than the ones granted to the EC2 instance, and I cannot hardcode credentials; they have to be inherited from the instance.
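For context on what I tried while debugging: boto3 falls back to the EC2 instance metadata service at 169.254.169.254 for instance-profile credentials, so one thing worth checking is whether that endpoint is even reachable from inside the function container. A minimal diagnostic sketch (the helper name is mine, not part of any library):

```python
import urllib.request

def instance_role_reachable(timeout=2):
    """Return the IAM role name reported by the EC2 metadata service,
    or None if the endpoint cannot be reached from this process
    (e.g. blocked or unrouted from inside the container network)."""
    url = "http://169.254.169.254/latest/meta-data/iam/security-credentials/"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.read().decode()
    except OSError:
        return None

print(instance_role_reachable())
```

If this prints the role name on the host but None inside the container, the container cannot reach the metadata service, which would explain the NoCredentialsError below.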
Expected Behaviour
This Python script should run and insert the data frame into a Redshift database (it works fine on the EC2 instance).
```python
from sqlalchemy import create_engine
import awswrangler as wr
import pandas as pd


def con_rs_bld():
    con_rs = wr.db.get_engine(
        db_type="postgresql",
        host="XXXXXXXXXXXXXX",
        port=55555555,
        database="XXXXXXXXXXXXXX",
        user="XXXXXXXXXXXXXX",
        password="XXXXXXXXXXXXXX",
    )
    return con_rs


def push_to_rs():
    # pandas dataframe for testing
    ...


def handle(req):
    push_to_rs()
```
Current Behaviour
When I run it in the OpenFaaS Docker Swarm variant I get the error below. `pip install` completes and everything else works fine.
```
Traceback (most recent call last):
  File "index.py", line 18, in <module>
    ret = handler.handle(st)
  File "/home/app/function/handler.py", line 114, in handle
    push_to_rs()
  File "/home/app/function/handler.py", line 109, in push_to_rs
    use_threads=True,
  File "/home/app/python/awswrangler/db.py", line 787, in copy_to_redshift
    max_rows_by_file=max_rows_by_file,
  File "/home/app/python/awswrangler/_config.py", line 360, in wrapper
    return function(args)
  File "/home/app/python/awswrangler/s3/_write_parquet.py", line 532, in to_parquet
    max_rows_by_file=max_rows_by_file,
  File "/home/app/python/awswrangler/s3/_write_dataset.py", line 75, in _to_dataset
    df=df, path_root=path_root, use_threads=use_threads, boto3_session=boto3_session, index=index, func_kwargs
  File "/home/app/python/awswrangler/s3/_write_parquet.py", line 179, in _to_parquet
    cpus=cpus,
  File "/home/app/python/awswrangler/s3/_write_parquet.py", line 137, in _to_parquet_chunked
    return proxy.close()  # blocking
  File "/home/app/python/awswrangler/s3/_write_concurrent.py", line 54, in close
    self._results += future.result()
  File "/opt/conda/lib/python3.7/concurrent/futures/_base.py", line 428, in result
    return self.__get_result()
  File "/opt/conda/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result
    raise self._exception
  File "/opt/conda/lib/python3.7/concurrent/futures/thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/app/python/awswrangler/s3/_write_concurrent.py", line 33, in _caller
    return func(func_kwargs)
  File "/home/app/python/awswrangler/s3/_write_parquet.py", line 106, in _write_chunk
    writer.write_table(table.slice(offset, chunk_size))
  File "/opt/conda/lib/python3.7/contextlib.py", line 119, in __exit__
    next(self.gen)
  File "/home/app/python/awswrangler/s3/_write_parquet.py", line 85, in _new_writer
    writer.close()
  File "/opt/conda/lib/python3.7/contextlib.py", line 119, in __exit__
    next(self.gen)
  File "/home/app/python/awswrangler/s3/_fs.py", line 597, in open_s3_object
    s3obj.close()
  File "/home/app/python/awswrangler/s3/_fs.py", line 541, in close
    function_name="put_object", s3_additional_kwargs=self._s3_additional_kwargs
  File "/home/app/python/awswrangler/_utils.py", line 287, in try_it
    return f(kwargs)
  File "/home/app/python/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/app/python/botocore/client.py", line 663, in _make_api_call
    operation_model, request_dict, request_context)
  File "/home/app/python/botocore/client.py", line 682, in _make_request
    return self._endpoint.make_request(operation_model, request_dict)
  File "/home/app/python/botocore/endpoint.py", line 102, in make_request
    return self._send_request(request_dict, operation_model)
  File "/home/app/python/botocore/endpoint.py", line 132, in _send_request
    request = self.create_request(request_dict, operation_model)
  File "/home/app/python/botocore/endpoint.py", line 116, in create_request
    operation_name=operation_model.name)
  File "/home/app/python/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/home/app/python/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/home/app/python/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/home/app/python/botocore/signers.py", line 90, in handler
    return self.sign(operation_name, request)
  File "/home/app/python/botocore/signers.py", line 162, in sign
    auth.add_auth(request)
  File "/home/app/python/botocore/auth.py", line 357, in add_auth
    raise NoCredentialsError
botocore.exceptions.NoCredentialsError: Unable to locate credentials
Exception ignored in: <function _S3Object.__del__ at 0x7f33986d4560>
Traceback (most recent call last):
  File "/home/app/python/awswrangler/s3/_fs.py", line 257, in __del__
  File "/home/app/python/awswrangler/s3/_fs.py", line 541, in close
  File "/home/app/python/awswrangler/_utils.py", line 287, in try_it
  File "/home/app/python/botocore/client.py", line 357, in _api_call
  File "/home/app/python/botocore/client.py", line 663, in _make_api_call
  File "/home/app/python/botocore/client.py", line 682, in _make_request
  File "/home/app/python/botocore/endpoint.py", line 102, in make_request
  File "/home/app/python/botocore/endpoint.py", line 132, in _send_request
  File "/home/app/python/botocore/endpoint.py", line 116, in create_request
  File "/home/app/python/botocore/hooks.py", line 356, in emit
  File "/home/app/python/botocore/hooks.py", line 228, in emit
  File "/home/app/python/botocore/hooks.py", line 211, in _emit
  File "/home/app/python/botocore/signers.py", line 90, in handler
  File "/home/app/python/botocore/signers.py", line 162, in sign
  File "/home/app/python/botocore/auth.py", line 357, in add_auth
botocore.exceptions.NoCredentialsError: Unable to locate credentials
```
Possible Solution
Steps to Reproduce (for bugs)
I'm using the standard Python Debian template, but I changed the base image in the Dockerfile to `FROM continuumio/miniconda3`.
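As a side note while reproducing: the default boto3 credential chain checks environment variables before falling back to the instance metadata service, so another quick check inside the container is whether anything earlier in the chain is visible. A minimal sketch (helper name is mine; I am not suggesting hardcoding keys, only inspecting what the chain would see):

```python
import os

def env_credentials_present():
    """True if the AWS credential environment variables that boto3
    consults first are both set in this process."""
    required = ("AWS_ACCESS_KEY_ID", "AWS_SECRET_ACCESS_KEY")
    return all(os.environ.get(name) for name in required)

print(env_credentials_present())
```

If this is False inside the container and the metadata service is also unreachable, every provider in the chain fails and boto3 raises exactly the NoCredentialsError shown above.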
Your Environment
FaaS-CLI version (full output from `faas-cli version`): 0.12.13
Provider: name: faas-swarm, orchestration: swarm, version: 0.9.0
Docker version (`docker version`): Server: Docker Engine - Community, Engine version 19.03.13
Are you using Docker Swarm or Kubernetes (FaaS-netes)? Docker Swarm
Operating System and version (e.g. Linux, Windows, MacOS): Linux