qubole / spark-on-lambda

Apache Spark on AWS Lambda
Apache License 2.0

Setup - Need S3 object read permission #1

Open luckyvaliin opened 6 years ago

luckyvaliin commented 6 years ago

As per the instructions, I was trying to copy the file, but the command errors out.

aws s3 cp s3://public-qubole/lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz s3://myBucket-xxx/spark-on-lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz --acl bucket-owner-full-control

Error: fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
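In case it helps narrow things down, the HeadObject call is what aws s3 cp makes first against the source key, and it needs s3:GetObject on that object. A direct way to reproduce the 403 with the same credentials (just a suggested check, not something from the original setup instructions) is:

aws s3api head-object --bucket public-qubole --key lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz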

Could you provide the appropriate access so users can read the necessary files?

Thanks Vali

venkata91 commented 6 years ago

Can you try running aws s3 ls s3://public-qubole/lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz? I tried with a different set of keys and the copy worked.

luckyvaliin commented 6 years ago

Yes, I am certainly able to list that object, and in fact many other objects under that hierarchy, but the copy doesn't work. I checked with AWS support and they confirmed that the object owner's permissions are not set correctly for it to be copied. Let me know if you have any questions.
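If it's useful on your side, the object's ACL can be inspected (by the object owner, or anyone granted READ_ACP on it; just a suggested check) with:

aws s3api get-object-acl --bucket public-qubole --key lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz

The Grants in the output should show whether the AllUsers group (or other accounts) has READ on the object.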

Thanks Mahaboob Vali Shaik


venkata91 commented 6 years ago

Alright, that was an issue on our side: the object owner's permissions didn't allow the public to do GET operations. It should be fine now, since we changed the owner permissions. Can you please try again and let us know if that fixes the issue?
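For reference, granting public read on a single object typically looks like the following (a sketch using the same bucket and key; the actual fix on our side may have been applied differently, e.g. through the console or a bucket policy):

aws s3api put-object-acl --bucket public-qubole --key lambda/spark-2.1.0-bin-spark-lambda-2.1.0.tgz --acl public-read

After that, the original aws s3 cp command with --acl bucket-owner-full-control should succeed.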