jdblischak / smk-simple-slurm

A simple Snakemake profile for Slurm without --cluster-config
Creative Commons Zero v1.0 Universal

Snakemake is not recognizing arguments #22

Closed: satyanarayan-rao closed this issue 8 months ago

satyanarayan-rao commented 8 months ago

Hi @jdblischak,

Many thanks for creating a simple profile for SLURM. I am facing issues using it, though:

I am using Snakemake version 8.4.2. From the Snakemake documentation, I learned that the config file should be named "config.v8+.yaml". I named it accordingly; here are its contents.

cat simple/config.v8+.yaml

cluster:
  mkdir -p logs/{rule} &&
  sbatch
    --partition={resources.partition}
    --cpus-per-task={threads}
    --mem={resources.mem_mb}
    --job-name=smk-{rule}-{wildcards}
    --output=logs/{rule}/{rule}-{wildcards}-%j.out
default-resources:
  - partition=small
  - mem_mb=1000
  - threads=48

restart-times: 3
max-jobs-per-second: 10
max-status-checks-per-second: 1
local-cores: 1
latency-wait: 60
jobs: 500
keep-going: True
rerun-incomplete: True
printshellcmds: True
scheduler: greedy
use-conda: True

On running snakemake command, I get the following error:

snakemake  -np --snakefile dl.smk  fastq_from_geo/SRR8245078_1.fastq.gz --profile simple/

snakemake: error: unrecognized arguments: --cluster=mkdir -p logs/{rule} && sbatch --partition={resources.partition} --cpus-per-task={threads} --mem={resources.mem_mb} --job-name=smk-{rule}-{wildcards} --output=logs/{rule}/{rule}-{wildcards}-%j.out 

A plain dry run without the profile works fine:

snakemake  -np --snakefile dl.smk  fastq_from_geo/SRR8245078_1.fastq.gz
Building DAG of jobs...
Job stats:
job                    count
-------------------  -------
download_using_curl        1
total                      1

Execute 1 jobs...

[Fri Feb  2 19:18:39 2024]
localrule download_using_curl:
    output: fastq_from_geo/SRR8245078_1.fastq.gz
    jobid: 0
    reason: Missing output files: fastq_from_geo/SRR8245078_1.fastq.gz
    wildcards: srr_id=SRR8245078_1
    resources: tmpdir=<TBD>

sh scripts/download.sh SRR8245078_1 metadata/srr_to_ftp_url.tsv fastq_from_geo/SRR8245078_1.fastq.gz
Job stats:
job                    count
-------------------  -------
download_using_curl        1
total                      1

Reasons:
    (check individual jobs above for details)
    missing output files:
        download_using_curl

This was a dry-run (flag -n). The order of jobs does not reflect the order of execution.

Any help would be greatly appreciated! Please let me know if you need more information.

Thank you very much, Satya

jdblischak commented 8 months ago

@satyanarayan-rao Snakemake 8 completely changed the interface to Slurm. Both this profile (#21) and the official slurm profile (https://github.com/Snakemake-Profiles/slurm/issues/117) were broken by these changes. You have 3 options:

  1. Downgrade to snakemake 7 and continue to use this profile (this is the quickest option)
  2. Determine how to adapt this profile to use the new Snakemake 8 behavior (see #21 for ideas for getting started)
  3. Switch to using the new Slurm submission that is built into Snakemake 8 (note that this will require you to give up some control, e.g. over where the log files are saved, as discussed in https://github.com/snakemake/snakemake-executor-plugin-slurm/issues/11)
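For option 3, a minimal sketch of what a Snakemake 8 profile might look like, assuming the snakemake-executor-plugin-slurm package is installed. The resource names (e.g. slurm_partition) come from that plugin and may differ across versions; the partition name and values are carried over from the config above for illustration:

```yaml
# Hypothetical config.v8+.yaml for Snakemake 8 with the Slurm executor plugin.
# `executor: slurm` replaces the old `cluster:` key, which Snakemake 8 no
# longer recognizes (hence the "unrecognized arguments: --cluster=..." error).
executor: slurm
jobs: 500
default-resources:
  slurm_partition: small  # assumed; taken from the original profile
  mem_mb: 1000
latency-wait: 60
keep-going: True
rerun-incomplete: True
printshellcmds: True
use-conda: True
```

With this profile in place, the submission command stays the same shape, e.g. `snakemake --profile simple/ ...`; log file locations are then managed by the plugin rather than by an sbatch --output template.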
satyanarayan-rao commented 8 months ago

Thanks a lot, @jdblischak! I had to downgrade to snakemake 6 and use --cluster-config option. I think I would wait for the stable version of the new Snakemake to migrate. Too much hassle I guess right now, or maybe I am making stupid mistakes.

jdblischak commented 6 months ago

I had to downgrade to snakemake 6 and use --cluster-config option.

I'm glad you got it working, but to clarify: I don't recommend using --cluster-config. It has been deprecated for years.

Too much hassle I guess right now, or maybe I am making stupid mistakes.

If you used the smk-simple-slurm profile in this repo, it should have worked fine with Snakemake 7. If you're interested in giving it a try and you still get errors, please feel free to open a new Issue to share the error message and your profile config file.

I think I would wait for the stable version of the new Snakemake to migrate.

I totally understand the frustration. One option would be to migrate from --cluster-config directly to the new Snakemake 8 cluster support, which would completely bypass the stage that this profile was meant to address.

Though I wouldn't get your hopes up about a mythical stable version of Snakemake. As long as it is actively developed (which is generally a good thing), it will continue to change. I have lots of old pipelines that use --cluster-config as well as plenty that use smk-simple-slurm. As long as you pin the version of snakemake in your conda env, your pipeline should continue to run.
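The version pinning mentioned above can be expressed in a conda environment file. A sketch, where the env name and the specific 7.x version are illustrative, not prescriptive:

```yaml
# environment.yml pinning Snakemake to the 7.x series so a profile that
# relies on --cluster (like smk-simple-slurm) keeps working unchanged
name: smk7  # hypothetical env name
channels:
  - conda-forge
  - bioconda
dependencies:
  - snakemake-minimal =7.*
```

Creating the env with `conda env create -f environment.yml` (or the mamba equivalent) then insulates the pipeline from the Snakemake 8 interface changes until you are ready to migrate.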