jenkinsci / pipeline-aws-plugin

Jenkins Pipeline Step Plugin for AWS
https://plugins.jenkins.io/pipeline-aws/
Apache License 2.0

S3Download wipes C drive if target set as "/" #215

Open · petertweed opened this issue 4 years ago

Description

I wish to download all items from a folder within an S3 bucket into the workspace on a Windows machine.
However, the job tried (mostly successfully) to delete everything on my hard disk, destroying the machine.

Pipeline code (bucket name is sanitised):

        withAWS(region: 'eu-west-2', credentials: 'MyCreds') {
            s3Download(
                file: '/',
                bucket: 'myBucket',
                path: 'product/',
                force: true)
        }

Log output:

[Pipeline] withAWS
Constructing AWS Credentials
Setting AWS region eu-west-2
[Pipeline] {
[Pipeline] s3Download
Downloading s3://myBucket/product/ to file:/C:/
[Pipeline] }
[Pipeline] // withAWS

Expected behaviour: I expected it to download (and overwrite) files in the workspace, which is C:\Program Files (x86)\Jenkins\workspace\Project-Windows\Product\DownloadProductS3. I didn't expect it to wipe everything!
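
For the record, what I believe I should have written is a workspace-relative target (an untested sketch, using the same sanitised bucket name):

        withAWS(region: 'eu-west-2', credentials: 'MyCreds') {
            s3Download(
                file: 'product',   // relative path => should resolve inside the job workspace
                bucket: 'myBucket',
                path: 'product/',
                force: true)
        }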

Environment

Jenkins-Version: 2.204.1

Java-Version: 1.8

Plugin-Version: 1.39

Master/Slave Setup: ran on master

ronronshaiko commented 4 years ago

+1 This command erased most of the files on a very important machine! Can you please fix it? :(

Command: s3Download(file: "/", bucket: S3_BUCKET, path:'sample_folder', force:true)

Thanks

hoegertn commented 4 years ago

The file parameter is an absolute file name on your disk if you start with /. An interesting thing is that your Jenkins has the necessary rights to do this; I would have expected it to fail due to missing permissions.

I am not sure how to "fix" this.
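
To illustrate the rule (a sketch only; region, credentials, and bucket name are placeholders):

        withAWS(region: 'eu-west-2', credentials: 'MyCreds') {
            // file starting with '/' is treated as absolute:
            // it resolves to the filesystem root (C:\ on Windows)
            s3Download(file: '/', bucket: 'myBucket', path: 'product/', force: true)

            // any other value is treated as relative:
            // it resolves under the job workspace
            s3Download(file: 'product/', bucket: 'myBucket', path: 'product/', force: true)
        }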

NivardoX commented 3 years ago

@hoegertn Hey, it happened to me too (on Linux), but it only deleted the files it had permission to delete. I guess it's worth a warning.

RaiaN commented 10 months ago

THIS PLUGIN IS ABSOLUTELY DANGEROUS TO USE!

Is this issue still not fixed? s3Download(file: "${params.projectId}/", bucket: env.S3_BUCKET, path: "${params.projectId}/", force: true)

Please be very, very careful! This command erased half of my drive when I forgot to set ${params.projectId}, i.e. when the parameter value was "".

P.S. I lost some photos and even a whole workspace with tons of different configurations! Jesus Christ!
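
My workaround for now is to fail fast before s3Download can ever see an empty parameter (a rough sketch of what I do; not guaranteed to cover every case):

        // Abort before s3Download can resolve file: "/" from an unset parameter.
        if (!params.projectId?.trim()) {
            error 'projectId is empty; refusing to run s3Download against "/"'
        }
        s3Download(file: "${params.projectId}/", bucket: env.S3_BUCKET,
                   path: "${params.projectId}/", force: true)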

@hoegertn

RaiaN commented 10 months ago

> The file parameter is an absolute file name on your disk if you start with /. An interesting thing is that your Jenkins has the necessary rights to do this; I would have expected it to fail due to missing permissions.
>
> I am not sure how to "fix" this.

Maybe only allow relative paths? Or never go above the workspace directory?
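
Until something like that exists in the plugin, a pipeline-side guard can approximate it (a rough sketch; TARGET_DIR is a made-up parameter name):

        // Reject anything that is not a plain relative path inside the workspace.
        def target = params.TARGET_DIR ?: ''
        if (!target || target.startsWith('/') || target.startsWith('\\') ||
                target.contains('..') || target =~ /^[A-Za-z]:/) {
            error "Refusing target '${target}': only workspace-relative paths are allowed"
        }
        s3Download(file: target, bucket: env.S3_BUCKET, path: 'product/', force: true)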

hoegertn commented 9 months ago

Thanks for pointing out this issue with the s3Download command. I get why this is a big deal, especially with the data loss you faced. However, I've decided not to make changes to the plugin at this point for a couple of reasons:

Avoiding Breaks for Others: Changing how s3Download works could mess things up for other users who haven't had any problems. I need to keep the plugin stable for everyone.

It's More About Jenkins and OS Settings: The problem is really about how Jenkins is set up and the permissions on your system, not just the plugin. It's super important to make sure your Jenkins config and OS permissions are tight and right for what you need.

Better Docs, Not Code Changes: I think the key here is to make sure everyone knows how to use the plugin safely. I'll work on beefing up the docs to highlight these kinds of risks and how to avoid them.

Sorry for any trouble this decision might cause. I'm all for making a safe and useful tool, and I believe better info and careful use is the way to go here.

RaiaN commented 9 months ago

@hoegertn Thank you for your reply. I agree that permissions must be set up.

However, I believe that force should act as an override IF and only IF there is something to override and it actually needs to do that. Currently, if the download fails for some reason (i.e. there is nothing to download, as in my case), the override happens anyway!

It seems to me that s3Download should never behave like "rmdir" or "rm -rf", as that is pretty dangerous and non-explicit (especially when the path is / for whatever reason).

If there is a way to put a [Deprecated] tag on s3Download and introduce a new API that handles the force option more gracefully (implementing an override instead), it would be a big, big improvement.
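
To make the intent concrete, here is roughly what I mean, sketched with the plugin's s3FindFiles step (assuming it lists keys under a path as documented; region and credentials are placeholders):

        withAWS(region: 'eu-west-2', credentials: 'MyCreds') {
            // Only touch local files when there really is something to download.
            def found = s3FindFiles(bucket: env.S3_BUCKET,
                                    path: "${params.projectId}/", onlyFiles: true)
            if (!found) {
                error "Nothing under ${params.projectId}/ in ${env.S3_BUCKET}; leaving local files alone"
            }
            s3Download(file: "${params.projectId}/", bucket: env.S3_BUCKET,
                       path: "${params.projectId}/", force: true)
        }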

RaiaN commented 3 months ago

@hoegertn Any plans to fix this function to make it less dangerous? Download logic SHOULD NOT hide "remove directory" logic!