epam / cloud-pipeline

Cloud agnostic genomics analysis, scientific computation and storage platform
https://cloud-pipeline.com
Apache License 2.0

Support standard streams in pipe storage cp #3391

Closed tcibinan closed 10 months ago

tcibinan commented 10 months ago

Relates #3388.

This pull request adds support for standard streams to pipe storage cp for AWS and GCP.

Check the pipe storage cp --help output to find out the details:

Usage: pipe.py storage cp [OPTIONS] SOURCE DESTINATION

  Copies files/directories between data storages or between a local
  filesystem and a data storage.

  Examples:

  I. Examples of copying local data to a remote storage.

  Upload a local file (file.txt) to a storage (s3://storage/file.txt):

      pipe storage cp file.txt s3://storage/file.txt

  Upload a local directory (dir) to a storage (s3://storage/dir):

      pipe storage cp -r dir s3://storage/dir

  [Linux] Upload a stream from standard input (-) to a storage
  (s3://storage/file.txt):

      cat file.txt | pipe storage cp - s3://storage/file.txt

  II. Examples of copying remote storage data locally.

  Download a storage file (s3://storage/file.txt) as a local file
  (file.txt):

      pipe storage cp s3://storage/file.txt file.txt

  Download a storage directory (s3://storage/dir) as a local directory
  (dir):

      pipe storage cp -r s3://storage/dir dir

  [Linux] Download a storage file (s3://storage/file.txt) as a stream to
  standard output (-):

      pipe storage cp s3://storage/file.txt - > file.txt
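The dash handling above boils down to a chunked copy between a readable stream and a writable sink, where `-` maps to the process's standard input or output. Below is a minimal illustrative sketch of that pattern; the function name and chunk size are assumptions for demonstration, not the actual pipe implementation:

```python
import io
import sys

CHUNK_SIZE = 8 * 1024 * 1024  # illustrative chunk size (assumption)

def copy_stream(source, destination, chunk_size=CHUNK_SIZE):
    """Copy a binary stream to a writable sink in fixed-size chunks.

    Returns the total number of bytes copied.
    """
    total = 0
    while True:
        chunk = source.read(chunk_size)
        if not chunk:  # empty read signals end of stream
            break
        destination.write(chunk)
        total += len(chunk)
    return total

# With '-' as SOURCE the reader would be sys.stdin.buffer;
# with '-' as DESTINATION the writer would be sys.stdout.buffer.
# Here we simulate both ends with in-memory buffers:
src = io.BytesIO(b"hello stream")
dst = io.BytesIO()
copied = copy_stream(src, dst)  # copied == 12
```

Copying in fixed-size chunks keeps memory usage bounded regardless of object size, which is why streaming uploads to object storages are typically backed by multipart uploads rather than a single buffered write.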