Closed jfy133 closed 2 years ago
I was wondering while creating the module if there is ever a downside to not limiting the download size at all. I guess it could be somewhat unexpected to get a file that's close to 100 GB or so but then again the user chose the respective IDs... What do you think?
Yeah, I would agree there... you should know what you're downloading. But on the other hand, maybe that's not something people check when fetchngs is making it 'so easy' to download stuff?
I would be okay with setting the default args to --max-size u, so it can still be overwritten.
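A minimal sketch of what that default could look like in a module config, assuming the prefetch process is addressed by the usual nf-core process name (the selector and the exact config location are assumptions here):

```groovy
// Hypothetical default in the pipeline's module config:
// prefetch's size cap is lifted ('u' = unlimited), and because it is
// set via ext.args, a user config can still override it.
process {
    withName: SRATOOLS_PREFETCH {
        ext.args = '--max-size u'
    }
}
```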
I get the same error. All my fastq files are above 40 GB. Is there a quick fix? I tried adding --max-size to the nextflow command, but I continue to get the same error.
nextflow run nf-core/fetchngs -c params.config --max-size 60G
--max-size is an argument of the prefetch tool itself, not a pipeline parameter, so passing it directly to nextflow run has no effect. In your local config, you can set:

process {
    withName: SRATOOLS_PREFETCH {
        ext.args = '--max-size 60g'
    }
}
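For anyone hitting the same wall: save the process block to a config file and pass it with -c. The filename and the other flags below are just placeholders for illustration:

```shell
# max_size.config is a hypothetical filename holding the process block above
nextflow run nf-core/fetchngs \
    -c max_size.config \
    --input ids.csv \
    --outdir results
```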
Looks like this is resolved so closing.
Hi @drpatelh. On NF Tower, since I'm a launch user, I don't have the permissions to modify this attribute, so it's not practical if I want to modify this for a specific run. Would it be possible to expose this --max-size parameter in the GUI by default?
Description of feature
I was trying to download some data, and apparently one of the files was 'too big' for the sra-tools prefetch step.
Seems like the solution is given in the error message. I will try specifying it with a custom modules.conf, but if it works I think it would be good to add inbuilt support :+1: