Closed neusmf closed 4 months ago
I was able to solve it. On the one hand, I was not accounting for RAM correctly, and on the other hand, after changing the work directory the results are now copied instead of symlinked.
Sorry for bothering you, Neus
Hi, and thanks for the pipeline! I am trying to run it, but I cannot manage to change the params through my config file.
Here is my command to run the pipeline:

```shell
nextflow run goodwright/clipseq -profile singularity -bg -w /nfs/scratch01/work_iclip -c custom_crg.config \
    --samplesheet demultiplexed_samples.csv \
    --fasta /db/ensembl/release-95/homo_sapiens/genome/Homo_sapiens.GRCh38.dna.fa.gz \
    --smrna_fasta homosapiens_smallRNA.fa.gz \
    --gtf /db/ensembl/release-95/homo_sapiens/gtf/Homo_sapiens.GRCh38.95.gtf.gz
```
Here are the relevant lines from my config file:
```groovy
params {
    // Boilerplate options
    publish_dir_mode = 'copy'
}
```
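For reference, a fuller sketch of what I intend the config to do, assuming the pipeline follows the nf-core convention of keeping `publish_dir_mode` and the resource caps in the same `params` block (parameter names should be checked against the pipeline's actual schema):

```groovy
// custom_crg.config -- hypothetical fuller version, values are examples
params {
    // Boilerplate options
    publish_dir_mode = 'copy'

    // Resource ceilings (nf-core convention: these cap what a process
    // may request, they do not change per-process defaults)
    max_memory = '120.GB'
    max_cpus   = 12
}
```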
When I check the log it seems fine:
```
Core pipeline options
  outdir           : ./results
  tracedir         : ./results/pipeline_info
  publish_dir_mode : copy
  max_memory       : 120 GB
  max_cpus         : 12
  max_time         : 20d 20h
```
But then all the submitted jobs request h_vmem=122880M,virtual_free=122880M, which is the pipeline default, not the maximum I specified. Also, I get symbolic links instead of copied files in the results folder.
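In case it is relevant: in nf-core-style pipelines, `params.max_memory` usually acts only as a ceiling (via a `check_max`-style helper) and does not lower what individual processes actually request. A hedged sketch of how a specific request could be overridden with a standard Nextflow process selector (the label name below is hypothetical; the real labels would be in the pipeline's base config):

```groovy
// Override per-process resources with a selector; syntax is standard
// Nextflow config, but the label and value here are only examples.
process {
    withLabel: 'process_high' {
        memory = '64.GB'
    }
}
```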
What do you suggest I do?
Thanks!