Open tiborsimko opened 2 months ago
One possible workaround until we solve the problem is to "duplicate" the rules so that parameters don't have to be wildcards. (This could be acceptable if the number of samples is small.) Here's a working example:
```python
rule all:
    input:
        expand("output/dataset_{sample}.txt", sample=config["samples"].keys())

rule dataset_WW:
    output:
        "output/dataset_WW.txt"
    container:
        "docker://docker.io/reanahub/reana-env-root6:6.18.04"
    params:
        dataset=config["samples"].get("WW", "/UNKNOWN")
    resources:
        kubernetes_memory_limit="256Mi"
    shell:
        "mkdir -p $(dirname {output}) && echo {params.dataset} > {output}"

rule dataset_DY:
    output:
        "output/dataset_DY.txt"
    container:
        "docker://docker.io/reanahub/reana-env-root6:6.18.04"
    params:
        dataset=config["samples"].get("DY", "/UNKNOWN")
    resources:
        kubernetes_memory_limit="256Mi"
    shell:
        "mkdir -p $(dirname {output}) && echo {params.dataset} > {output}"
```
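For context, the `expand()` call in `rule all` simply generates one target file name per configured sample. A minimal sketch in plain Python, using a hypothetical `config["samples"]` mapping (the sample paths are illustrative assumptions):

```python
# Hypothetical config["samples"] mapping sample names to dataset paths.
samples = {"WW": "/eos/datasets/WW", "DY": "/eos/datasets/DY"}

# expand("output/dataset_{sample}.txt", sample=samples.keys()) is roughly
# equivalent to formatting the pattern once per sample name:
targets = ["output/dataset_{sample}.txt".format(sample=s) for s in samples]
print(targets)  # → ['output/dataset_WW.txt', 'output/dataset_DY.txt']
```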
Current behaviour
Consider the following workflow example:
reana.yaml
inputs.yaml
Snakefile
(file contents not reproduced here)
This Snakemake example is specific in that the analysis wants to use wildcards in rule parameters, which is usually done by means of lambda functions operating on the wildcards object.
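For illustration, the wildcard-based pattern described above looks roughly like this single generic rule (a sketch, not the actual Snakefile from the example; the config lookup and fallback path are assumptions):

```
rule dataset:
    output:
        "output/dataset_{sample}.txt"
    params:
        # The lambda receives the wildcards object when the job is
        # instantiated; the .get() fallback is illustrative.
        dataset=lambda wildcards: config["samples"].get(wildcards.sample, "/UNKNOWN")
    shell:
        "mkdir -p $(dirname {output}) && echo {params.dataset} > {output}"
```

This replaces the per-sample duplicated rules with one parameterised rule, which is exactly the construct that currently fails on REANA.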
The local execution works well.
The submission of the same workflow to REANA does not pass.
Expected behaviour
Workflows that run well locally should also run successfully on REANA.
Notes
This problem may be best addressed as part of the "thin client" sprint, when the client would send only the files and workflow creation would be done fully on the server side.
If a workaround can be found in the client/server combination for the forthcoming 0.9.4 release, that would be even better.