franciscozorrilla / metaGEM

:gem: An easy-to-use workflow for generating context specific genome-scale metabolic models and predicting metabolic interactions within microbial communities directly from metagenomic data
https://franciscozorrilla.github.io/metaGEM/
MIT License

Error parsing number of cores (--cores, -c, -j): must be integer, empty, or 'all'. #103

Closed — CynthiaChibani closed this issue 2 years ago

CynthiaChibani commented 2 years ago

Hey Francisco, I keep getting an error when I try to submit my job to our cluster. Here is an example of my cluster_config.json, if you can have a quick look:

{
    "__default__" : {
        "account" : "SOMENAME",
        "time" : "0-48:00:00",
        "n" : 48,
        "tasks" : 1,
        "mem" : "180G",
        "name" : "DL.{rule}",
        "output" : "logs/{wildcards}.%N.{rule}.out.log"
    }
}

I kind of followed what you detailed here: https://github.com/franciscozorrilla/metaGEM/wiki/Cluster-config and I am trying to run this example command: bash metaGEM.sh -t binRefine -j 2 -c 48 -m 180 -h 48. Thanks in advance for your input.
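As a quick sanity check (an editorial sketch, not part of metaGEM itself — metaGEM simply hands this file to Snakemake), a cluster config like the one above can be validated with Python's json module before submitting; unquoted values or trailing commas are rejected with a line number:

```python
import json
import tempfile

def validate_config(path):
    """Parse a Snakemake cluster config and report any JSON syntax error."""
    try:
        with open(path) as fh:
            config = json.load(fh)
    except json.JSONDecodeError as err:
        return f"invalid JSON at line {err.lineno}: {err.msg}"
    return f"valid JSON with top-level keys: {sorted(config)}"

# Demo on a minimal, hypothetical config:
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as fh:
    fh.write('{"__default__": {"mem": "180G", "time": "0-48:00:00"}}')
print(validate_config(fh.name))  # valid JSON with top-level keys: ['__default__']
```

Running this against the real cluster_config.json rules out plain syntax errors before debugging anything in metaGEM.sh or Snakemake.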

franciscozorrilla commented 2 years ago

Hi Cynthia,

Just to double check, is your cluster account username SOMENAME? Or did you simply redact your username for privacy? You will not be able to submit jobs to your cluster without a valid account.

Best, Francisco

CynthiaChibani commented 2 years ago

Oh, I just replaced my username for privacy.

franciscozorrilla commented 2 years ago

OK, thanks for clarifying. It looks like the metaGEM.sh parser is malfunctioning for some reason and failing to pass the -j/nJobs argument on to Snakemake. My first instinct would be to try submitting jobs "manually", bypassing the parser, for troubleshooting: could you try running the following command?

nohup snakemake all -j 2 -k --cluster-config cluster_config.json -c "sbatch -A {cluster.account} --mem {cluster.mem} -t {cluster.time} -n {cluster.n} --ntasks {cluster.tasks} --cpus-per-task {cluster.n} --output {cluster.output}" &

Please let me know if this resolves your issue! Best, Francisco

CynthiaChibani commented 2 years ago

Dear Francisco, I get the same error, unfortunately. Best, Cynthia

franciscozorrilla commented 2 years ago

What version of Snakemake are you running? E.g. conda list | grep snakemake

Snakemake is under constant development, and it seems that some changes to the invocation/parameters have been introduced in recent versions. Could you try downgrading Snakemake to v5.10.0, as specified in the methods section of the metaGEM publication?

Also, is your cluster Slurm-based?

CynthiaChibani commented 2 years ago

Hey Francisco, yes, our cluster is Slurm-based. I had Snakemake v7.2.1 automatically downloaded; downgrading it to v5.10 seems to have solved the issue, at least so far :D Thanks again :):)

franciscozorrilla commented 2 years ago

Great, thanks for the info Cynthia! I will note this version specification in the installation section. I will close this issue for now, but please re-open/create a new issue if you have further problems.

Best, Francisco

tan-yuwei commented 2 years ago

Hi Francisco, how can I downgrade the Snakemake version properly? I directly replaced the automatically downloaded Snakemake file in envs/metaGEM with another Snakemake file, but the same error still happens.

Thanks in advance!

Best regards, Yuwei

franciscozorrilla commented 2 years ago

Hi Yuwei,

Please refer to the conda documentation for package management instructions. You can remove the snakemake package from your metaGEM env and then install the specific version you want; alternatively, you can start a fresh new environment and specify the downgraded version of Snakemake (see documentation). With commit https://github.com/franciscozorrilla/metaGEM/commit/14d0130a1874c2bd4376319b348d9ce466f31aee I have now updated the metaGEM conda specification file to force the installation of the correct version of Snakemake.
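For reference, a minimal downgrade sketch using conda (assuming the env is named metagem, as under metaGEM's envs/ directory; adjust the env name to your setup):

```shell
# Sketch: pin Snakemake to v5.10.0 inside the metaGEM env
# (assumes conda is installed and the env is named "metagem").
conda activate metagem
conda remove snakemake --yes
conda install -c conda-forge -c bioconda snakemake=5.10.0 --yes
snakemake --version   # expect: 5.10.0
```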

Best, Francisco

tan-yuwei commented 2 years ago

Hi Francisco,

Thanks for your reply and your help. The parsing error is solved this time, but another error unfortunately happened. My situation is: I reinstalled Miniconda as base (/home/yw/miniconda3) and reinstalled the updated metaGEM at /home/yw/yw. Then I activated the environment with the command source activate /home/yw/yw/metaGEM/envs/metagem. Next, I tried to run the command bash metaGEM.sh -t fastp -j 3 -c 2 -m 20 -h 2 with the three provided sample datasets, but it unfortunately failed. The nohup.out showed:


Building DAG of jobs...
Using shell: /usr/bin/bash
Provided cluster nodes: 3
Conda environments: ignored
Job counts:
        count   jobs
        1       all
        3       qfilter
        4
Select jobs to execute...

[Sun Jun 12 18:36:21 2022]
rule qfilter:
    input: /home/yw/yw/metaGEM/dataset/sample1/sample1_R1.fastq.gz, /home/yw/yw/metaGEM/dataset/sample1/sample1_R2.fastq.gz
    output: /home/yw/yw/metaGEM/qfiltered/sample1/sample1_R1.fastq.gz, /home/yw/yw/metaGEM/qfiltered/sample1/sample1_R2.fastq.gz
    jobid: 1
    wildcards: IDs=sample1

/bin/sh: 1: sbatch: not found
Error submitting jobscript (exit code 127): Job failed, going on with independent jobs.


Looking forward your reply,

Best regards, Yuwei

franciscozorrilla commented 2 years ago

It looks like you are missing the sbatch command. What setup are you using to run metaGEM? Are you running it on an HPC?
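A quick way to check (generic shell, nothing metaGEM-specific): sbatch only exists on machines with Slurm installed, so test whether it is on your PATH.

```shell
# Generic sketch: check whether the Slurm submission command is available.
check_sbatch() {
    if command -v sbatch >/dev/null 2>&1; then
        echo "sbatch found at $(command -v sbatch)"
    else
        echo "sbatch missing: this machine is not a Slurm submit node"
    fi
}
check_sbatch
```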

tan-yuwei commented 2 years ago

Hi Francisco,

I installed metaGEM on Ubuntu, set up via VirtualBox on my laptop.

Best, Yuwei

franciscozorrilla commented 2 years ago

Due to the computational demands of metagenome assembly, it is generally not feasible to run a realistic metagenomics analysis on your laptop. For testing, e.g. running on subsampled/toy data, you may run locally on your laptop by using the --local flag. As an example, see the Google Colab notebook, where you can find code specifically in section 5) for quality filtering:

bash metaGEM.sh --task fastp --local

and section 6) for assembly:

bash metaGEM.sh --task megahit --local

Note that this toy dataset consists of heavily downsampled reads. Please open a new issue if your questions are unrelated to the original post.