lithops-cloud / lithops

A multi-cloud framework for big data analytics and embarrassingly parallel jobs that provides a universal API for building parallel applications in the cloud ☁️🚀
http://lithops.cloud
Apache License 2.0

IBM COS SSL validation fail #63

Closed · chen116 closed this issue 5 years ago

chen116 commented 5 years ago

Hi,

I am trying to use pywren-ibm with IBM Cloud Functions and IBM COS. After setting up an IBM Cloud account and editing the ~/.pywren_config file, I am able to run ./deploy_runtime, but when I try to execute python3 test/testpywren.py init, I get the following error:

Unable to create bucket: SSL validation failed for https://s3-api.us-geo.objectstorage.softlayer.net/pywrenbuck/__pywren.test/test0 [Errno 2] No such file or directory

I also tried the endpoint https://s3.us.cloud-object-storage.appdomain.cloud but still got the same error. How should I fix this SSL issue? Thanks.
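In case it helps narrow this down, here is a minimal check (a sketch; it assumes the requests library is installed and uses the same endpoint as above) to see whether plain TLS verification against the COS endpoint works at all from this machine:

import requests

endpoint = "https://s3-api.us-geo.objectstorage.softlayer.net"

try:
    # An unauthenticated GET is enough to exercise certificate verification;
    # COS answers with an XML error body, which is fine for this test.
    resp = requests.get(endpoint, timeout=10)
    print("TLS verification OK, HTTP status:", resp.status_code)
except requests.exceptions.SSLError as exc:
    print("TLS verification failed:", exc)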

gilv commented 5 years ago

@chen116 this seems to be a configuration-related issue. Which COS bucket did you configure for the PyWren runtime? Can you please copy-paste your config file (without credentials, of course)?

chen116 commented 5 years ago

Hi,

Thanks for the reply, my config file looks a bit like this:

pywren:
    storage_bucket: pywrenbuck
    #storage_prefix: pywren.jobs
    #storage_backend: ibm_cos
    #data_cleaner: <True/False>
    #invocation_retry: <True/False>
    #retry_sleeps: [1, 5, 10, 20, 30]
    #retries: 5
ibm_cf:
    endpoint    : https://openwhisk.ng.bluemix.net
    namespace   : chen116@usc.edu_dev
    api_key     :  blah blah blah
ibm_cos:
    endpoint    : https://s3-api.us-geo.objectstorage.softlayer.net
    #(also tried with this url) endpoint   : https://s3.us-south.cloud-object-storage.appdomain.cloud
    api_key    : blah blah blah

Also, I am not able to create my own runtime. For example, when I run ./deploy_runtime clone python/3.5-stretch, the error I get is also related to SSL validation:

ibm_botocore.exceptions.SSLError: SSL validation failed for https://s3.us-south.cloud-object-storage.appdomain.cloud/pywrenbuck/runtimes/3.5-stretch.meta.json [Errno 2] No such file or directory

Thanks,

gilv commented 5 years ago

@chen116 do you have an existing bucket named "pywrenbuck"? This bucket needs to exist in advance, before you use it with PyWren. (Also make sure the bucket is in the same region as the endpoint you provided, "s3-api.us-geo".)
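For example, a quick check like this (a sketch; it assumes the ibm-cos-sdk package, and the client parameters follow the IBM COS examples, so adjust them to your own service credentials and endpoint) can confirm the bucket already exists at that endpoint before PyWren tries to use it:

import ibm_boto3
from ibm_botocore.client import Config

cos = ibm_boto3.client(
    "s3",
    ibm_api_key_id="<your COS api_key>",
    ibm_service_instance_id="<your COS resource instance CRN>",
    config=Config(signature_version="oauth"),
    endpoint_url="https://s3-api.us-geo.objectstorage.softlayer.net",
)

# List the buckets visible at this endpoint and check the one from the config.
buckets = [b["Name"] for b in cos.list_buckets()["Buckets"]]
print("pywrenbuck exists at this endpoint:", "pywrenbuck" in buckets)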

gilv commented 5 years ago

@chen116 a few other issues I just noticed: what is python/3.5-stretch? Is this an image you created somehow?

Which Python version are you using, 3.5 or 3.6? If you use 3.5, you can experiment with this command: ./deploy_runtime clone cactusone/pywren:3.5, which will clone the 3.5 runtime from my Docker Hub. Alternatively, you can edit the Dockerfile, change 3.6 to 3.5, and deploy it from your own Docker Hub account with ./deploy_runtime create "your docker hub name"/pywren:3.5

If you use Python 3.6 then just run ./deploy_runtime without any modifications.

chen116 commented 5 years ago

Hi, thanks again for your patience,

Yes, I created the bucket "pywrenbuck" and obtained the api_key through the newly created service credentials. I also changed the endpoint to the one where my bucket is located, "https://s3.us-east.cloud-object-storage.appdomain.cloud", but it still doesn't work. I also tried the ACCESS_KEY and SECRET_KEY for IBM COS, but I am still getting the same SSL validation error / [Errno 2] No such file or directory.

python/3.5-stretch is an existing Docker image from the official Python repository on Docker Hub.

I tried ./deploy_runtime clone cactusone/pywren:3.5 but I am still getting an error like this:

ibm_botocore.exceptions.SSLError: SSL validation failed for https://s3.us-east.cloud-object-storage.appdomain.cloud/pywrenbuck/runtimes/pywren_3.5.meta.json [Errno 2] No such file or directory

Any hints/directions welcome. I would really like to see it run. Thanks again!

chen116 commented 5 years ago

Hi, so I solved the issue, or rather worked around it. I was running PyWren inside a VM in an OpenStack cluster, so it must have been something in the networking that caused the issue. Still, thanks for your replies along the way!
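In case anyone else hits the same "[Errno 2] No such file or directory" during SSL validation: that message can also appear when the client is pointed at a CA bundle file that does not exist, so a quick check like this (a sketch; it assumes certifi is installed) shows which bundle the environment is actually using:

import os
import certifi

# Default CA bundle shipped with certifi (used by requests/ibm_botocore).
print("certifi bundle:", certifi.where(), "exists:", os.path.exists(certifi.where()))

# Environment overrides; pointing any of these at a missing file can
# produce this kind of SSL validation error.
for var in ("REQUESTS_CA_BUNDLE", "CURL_CA_BUNDLE", "AWS_CA_BUNDLE"):
    path = os.environ.get(var)
    if path:
        print(var, "=", path, "exists:", os.path.exists(path))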

gilv commented 5 years ago

@chen116 glad it worked