[Closed] koushik20 closed this issue 2 years ago
Hi, this is a memory issue. You should increase the resources for this job. By default it runs with 12 GB of RAM, which seems not to be enough in your case.
I would suggest creating your own config file and changing the RAM for this process:
process {
    withName:remove_duplicates {
        memory = 40.GB
    }
}
see https://nf-co.re/usage/configuration#custom-configuration-files
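For completeness, a minimal sketch of such a config file (the file name `custom.config` is arbitrary; the memory value is just the figure suggested above):

```groovy
// custom.config -- raise the RAM for the deduplication step
// (40.GB is the value suggested in this thread, not a general recommendation)
process {
    withName:remove_duplicates {
        memory = 40.GB
    }
}
```

It can then be passed to the pipeline with Nextflow's `-c` option, e.g. `nextflow run nf-core/hic -c custom.config ...` with your other options unchanged.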
btw, what was the issue with v1.3.0? Was it a conda issue?
Thank you for your suggestions; the pipeline completed successfully.
When running with version 1.3.0 I got the following error:
Execution cancelled -- Finishing pending tasks before exit
- Ignore this warning: params.schema_ignore_params = "saveReference,splitFastq"
WARN: Found unexpected parameters:
* --saveReference: true
* --splitFastq: 10000000
Error executing process > 'bowtie2_end_to_end (HiChIP_MCF10A_R1)'
Caused by:
Process exceeded running time limit (8h)
Command executed:
INDEX=`find -L ./ -name "*.rev.1.bt2" | sed 's/.rev.1.bt2//'`
bowtie2 --rg-id BMG --rg SM:HiChIP_MCF10A-A_S7_R1_001 \
--very-sensitive --end-to-end --reorder \
-p 4 \
-x ${INDEX} \
--un HiChIP_MCF10A-A_S7_R1_001_unmap.fastq \
-U HiChIP_MCF10A-A_S7_R1_001.fastq.gz | samtools view -F 4 -bS - > HiChIP_MCF10A-A_S7_R1_001.bam
Command exit status:
-
Command output:
(empty)
Work dir:
/mnt/hichip_fastq/work/f9/f391d404fdf5768da48979b926096d
Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`
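Since this v1.3.0 failure is a walltime limit rather than a memory limit, the same custom-config approach should apply; a sketch, assuming the process name from the error message and an arbitrary 24-hour limit:

```groovy
// custom.config -- raise the walltime for the end-to-end alignment step
// (24.h is an illustrative value, not a recommendation)
process {
    withName:bowtie2_end_to_end {
        time = 24.h
    }
}
```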
Hello,
I am running the nf-core/hic pipeline on breast samples (6 samples in total, including replicates), with 500 million reads each, and I am getting the following error.
The script I ran is below.
I tried version 1.3.0, but the pipeline couldn't complete the bowtie2_end_to_end process, so I am running version 1.0.0 instead.
Any thoughts?