Closed phiweger closed 3 years ago
In the advanced launch option, you can use the post-run script for doing that
Ah, that makes sense @pditommaso thx!
How can I pass data/ parameters to this field in the json request? Like this?
```json
{
  ...
  "postRunScript": "bash bin/compress.sh ${workflow.launchDir}/${params.results}.tar.gz ${workflow.launchDir}/${params.results}",
  ...
}
```
If not, do you have a mock example?
And, as stated in the documentation, where would I need to copy data to (`wf.launchDir`?) in order for it to show up in the bucket?
I got so far as to have this simple example:
```json
... "postRunScript": "#!/bin/bash\nprintf foo; touch $NXF_WORK/bar", ...
```
It prints alright, but no "bar" file is created. `$NXF_WORK` points to the right bucket, but how can I create/move files there?
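For comparison, here is a sketch of a script that would actually get a file into the work directory. This is an assumption-laden illustration, not documented Tower behaviour: the variable names, defaults, and the availability of `gsutil` on the VM are all made up for the example. The key point it illustrates is that when `$NXF_WORK` is a `gs://` URI, `touch $NXF_WORK/bar` only creates a local file whose name happens to start with `gs:`; an object store needs an explicit upload.

```shell
#!/bin/bash
set -euo pipefail

# Sketch only: names and defaults below are assumptions, not Tower behaviour.
RESULTS_DIR=${RESULTS_DIR:-results}
NXF_WORK=${NXF_WORK:-/tmp/nxf-work-demo}

# Demo scaffolding so the script is runnable locally.
[[ $NXF_WORK == gs://* ]] || mkdir -p "$NXF_WORK"
mkdir -p "$RESULTS_DIR"

# Create the archive locally first...
tar -czf "${RESULTS_DIR}.tar.gz" "$RESULTS_DIR"

# ...then upload it explicitly. A plain `touch "$NXF_WORK/bar"` with a
# gs:// value would only create a local file, never a bucket object.
if command -v gsutil >/dev/null 2>&1 && [[ $NXF_WORK == gs://* ]]; then
  gsutil cp "${RESULTS_DIR}.tar.gz" "$NXF_WORK/"
else
  cp "${RESULTS_DIR}.tar.gz" "$NXF_WORK/"    # local fallback for testing
fi
```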
Well, there's no explicit parametrisation, because this is not expected to be another pipeline step. You should explicitly provide the expected inputs/outputs.
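Reading that together with the JSON attempt above: since the post-run script is not parametrised, the `${workflow.launchDir}`/`${params.results}` placeholders would not be resolved, so the paths have to be spelled out literally. A hedged sketch of such a request fragment (the bucket and directory names are made up for illustration):

```json
{
  "postRunScript": "tar -czf results.tar.gz results && gsutil cp results.tar.gz gs://my-bucket/results.tar.gz"
}
```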
I am trying to compress the `publishDir` directory (which is set with `params.results`) after the workflow completes. I want to reduce the amount of data to download, because I launch the workflow using the Tower API on Google.

Here is my `onComplete` handler:

`compress.sh` simply contains:

However, the corresponding tar archive does not appear in my bucket. There is no error message about the handler in the log files.

This "feels" like nf does not stage files created by the handler, so they might be lost when the VMs are shut down. I am not sure `workflow.launchDir` is the right attribute for workflow introspection here, but `workflow.publishDir` does not exist, right? My main question is: what am I doing wrong here? :)

Thanks a lot!
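The handler itself is not quoted in the thread; a hypothetical reconstruction, modeled purely on the `postRunScript` line quoted earlier (the actual handler and `compress.sh` contents may differ), would be a config fragment along these lines:

```nextflow
// Hypothetical sketch, not the poster's actual code: shell out to the
// compression helper once the run finishes. Note this runs on the head
// node, and files it creates locally are not staged back to the bucket.
workflow.onComplete {
    def cmd = "bash bin/compress.sh ${workflow.launchDir}/${params.results}.tar.gz ${workflow.launchDir}/${params.results}"
    cmd.execute().waitFor()
}
```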