awslabs / aws-java-nio-spi-for-s3

A Java NIO.2 service provider for Amazon S3
Apache License 2.0

Copy a file from a bucket to another with different credentials #534

Closed mirkoscotti closed 1 month ago

mirkoscotti commented 1 month ago

I need to copy files from my bucket to a bucket managed by an external owner. The source and target buckets are accessed with different credentials, and I did not find a way to do this.

It seems that the library only allows copying when the source and target buckets are accessible with the same credentials. Can you confirm this? If so, I would like to request a new feature to cover my requirement, or ask you to show me a way to do it with the current version. Thanks.

markjschreiber commented 1 month ago

This is correct: a client cannot hold two sets of AWS credentials at the same time, and only one client can be used in a copy operation.

The best solution to this problem would be to use a role that has S3 permissions on the source bucket and that has been granted access to the destination bucket via an IAM bucket policy (see https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-policies.html). This removes the need for the bucket owner to create a role/user for you in their account, and also avoids any permanent credentials.
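For reference, a destination-side bucket policy for this pattern might look roughly like the sketch below. The account ID, role name, and bucket name are placeholders, not values from this thread:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCrossAccountPut",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:role/copy-role" },
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::destination-bucket/*"
    }
  ]
}
```

With a policy like this on the destination bucket, the role in the source account can write objects there, so a single client assuming that role can perform the copy directly.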

The only alternative would be to download the files to disk with one set of credentials and then, with another client using different credentials, upload them to the destination bucket. This is much less efficient than a direct copy on S3 because of the extra I/O.
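The two-hop copy described above can be sketched with plain NIO.2 as below. To keep the sketch runnable anywhere, local temp directories stand in for the two buckets; in real use the source and destination `Path`s would come from file systems configured with the respective credentials, and each hop would go through its own client:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class StagedCopy {
    public static void main(String[] args) throws IOException {
        // Stand-ins for the two buckets; in real use these Paths would be
        // resolved from S3 file systems holding different credentials.
        Path sourceBucket = Files.createTempDirectory("bucket1");
        Path destBucket = Files.createTempDirectory("bucket2");
        Path source = Files.writeString(sourceBucket.resolve("source_file"), "payload");

        // Hop 1: download the object to a local staging file with the
        // first set of credentials.
        Path staging = Files.createTempFile("staging", ".tmp");
        Files.copy(source, staging, StandardCopyOption.REPLACE_EXISTING);

        // Hop 2: upload the staging file to the destination bucket with the
        // second set of credentials.
        Path dest = destBucket.resolve("destination_file");
        Files.copy(staging, dest, StandardCopyOption.REPLACE_EXISTING);

        // Clean up the intermediate copy and verify the result.
        Files.delete(staging);
        System.out.println(Files.readString(dest));
    }
}
```

The extra write and read of the staging file is the I/O cost mentioned above, which a server-side S3 copy avoids.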

mirkoscotti commented 1 month ago

Yes, I do not like the use of an intermediate file system... OK, I will follow your hint. Thanks very much.

stefanofornari commented 1 month ago

I actually think you can do it by creating two paths on different file systems and using NIO; at least with the s3x provider it should work, and if not, I would say it is a bug. It should be something like this:

import java.net.URI;
import java.nio.file.*;

// each URI carries its own credentials, so each path resolves to its own file system
Path p1 = Paths.get(URI.create("s3x://key1:secret1@some.where.com:1010/bucket1/source_file"));
Path p2 = Paths.get(URI.create("s3x://key2:secret2@some.where.com:1010/bucket2/destination_file"));

Files.copy(p1, p2);

I do not have a handy way to try it right now; if it does not work, I'll check in the next few days.

markjschreiber commented 1 month ago

This is a fair point. If you are using an AWS S3 bucket with IAM credentials, then you can't do it (other than using an IAM role with access to both buckets). If you are using something like MinIO or another S3-compatible system, you could do it as @stefanofornari shows.