ghost opened this issue 6 years ago (status: Open)
Using the same gsutil and cloud SDK version you mentioned above, I can copy files without issue from my S3 bucket to my GCS bucket. Judging from the 403, it's likely that you don't have some necessary permission for your operation. Off the top of my head, here are some permissions you might want to check (make sure you're checking permissions for the user corresponding to the S3 credentials you're using):
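One quick way to check those S3 permissions empirically is to hit the bucket with the AWS CLI using the same access key/secret that gsutil is configured with. This is only a sketch: the bucket/key come from this thread, the `s3-creds` profile name and `example.txt` key are placeholders. Note that a recursive copy has to list the bucket first, while a single-object copy only needs to read that object, so a 403 on the listing would fit the symptoms here:

```shell
# Probe the S3 side with the same credentials gsutil uses.
# "s3-creds" is a hypothetical AWS CLI profile holding that key pair.

# Needs s3:ListBucket. A recursive cp performs a listing first, so a
# 403 here would explain recursive copies failing while single-object
# copies succeed.
aws s3 ls s3://secret-bucket/some_key/ --recursive --profile s3-creds

# Needs s3:GetObject on an individual key ("example.txt" is a
# placeholder object name).
aws s3 cp s3://secret-bucket/some_key/example.txt . --profile s3-creds
```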
That's an effective and useful list for a process of elimination. But here's what I discovered after creating this ticket: I can successfully copy the files one at a time from the source to the destination (S3 to the Google bucket). Thoughts?
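Since single-object copies succeed, one stopgap while this gets debugged could be to script the per-object copies. This is a sketch, not a fix for the underlying 403; the bucket and key names are the ones used elsewhere in this thread:

```shell
# List every object under the prefix, then copy them one at a time,
# since single-object copies work where the recursive copy 403s.
gsutil ls -r s3://secret-bucket/some_key/ |
  grep -v ':$' |    # "gsutil ls -r" emits "<dir>:" header lines; skip them
  while read -r obj; do
    # skip the blank separator lines in the listing
    [ -n "$obj" ] && gsutil cp "$obj" gs://secret-elsewhere/destination/
  done
```

This is much slower than a recursive copy (one request per object), so it's only worth it as a workaround.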
My next thought is to do the unpleasant thing and dive into the bucket logging on the S3 side. Is there any logging on the Google Cloud side that you know is worth looking at during this troubleshooting?
You could try running gsutil with the top-level -D flag and checking which request got the 403 response. If those logs are too jumbled together, I'd try breaking this up into two steps, which might also help isolate which cloud provider/request is the problem:
# 1) Copy from S3 to local disk
gsutil -D cp -r s3://secret-bucket/some_key/ /local/directory/my-s3-files/
# 2) Copy from local disk to GCS bucket
gsutil -D cp -r /local/directory/my-s3-files/ gs://secret-elsewhere/destination/
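Since the -D output is verbose, one way to avoid eyeballing the whole stream is to tee it to a file and pull out just the exchange around the 403 (paths and bucket names as in the commands above):

```shell
# Save the full debug stream while the copy runs.
gsutil -D cp -r s3://secret-bucket/some_key/ /local/directory/my-s3-files/ \
  2>&1 | tee gsutil-debug.log

# Show a few lines of context around each 403 so the failing request's
# method, URL, and headers are visible.
grep -n -B 5 -A 5 '403' gsutil-debug.log
```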
Doing a recursive copy fails:
gsutil version: 4.28
Google Cloud SDK 180.0.1
app-engine-python 1.9.63
beta 2017.09.15
bq 2.0.27
core 2017.11.20
gsutil 4.28
EDIT: actual error