nf-core / sarek

Analysis pipeline to detect germline or somatic variants (pre-processing, variant calling and annotation) from WGS / targeted sequencing
https://nf-co.re/sarek
MIT License

Java applications request too much memory #1332

Open rrlove-cdc opened 7 months ago

rrlove-cdc commented 7 months ago

Description of the bug

The sarek pipeline repeatedly errors out at the GATK MarkDuplicates step with the message:

"Error occurred during initialization of VM Could not reserve enough space for 50331648KB object heap"

Sometimes an error message for "FASTQ_ALIGN_BWAMEM_MEM2_DRAGMAP_SENTIEON:BWAMEM1_MEM" also appears.

With help from our HPC support team, I have tried several custom resource configurations (including the custom_resources_w_markduplicates.conf passed with -c in the command below).

In all cases, GATK still requests the larger heap size, so the custom settings do not appear to be passed through to the program. A member of the HPC support team suggested the issue might be solvable by adding 'ext.args = "-Xmx8g"' in the module-level config files rather than the user-level config file.
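
In case it helps while this is sorted out, here is a minimal sketch of capping memory for just the MarkDuplicates process via a custom config passed with -c. This assumes (as for most nf-core GATK4 modules) that the -Xmx value is derived from task.memory rather than from ext.args, so lowering the process memory should lower the requested heap; the 10.GB value is only an example.

    process {
        // The selector matches the process by its simple name; the fully qualified name
        // 'NFCORE_SAREK:SAREK:BAM_MARKDUPLICATES:GATK4_MARKDUPLICATES' can also be used.
        withName: 'GATK4_MARKDUPLICATES' {
            // Example value only: if the module asks for roughly 80% of task.memory as heap,
            // 10.GB here would translate to approximately -Xmx8192M.
            memory = 10.GB
        }
    }

If the scheduler is the limiting factor, the same block can also lower cpus or time so the job fits what the SGE queue will actually grant.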

Command used and terminal output

nextflow run nf-core/sarek \
--genome null \
--igenomes_ignore \
--fasta ${ref} \
--bwamem2 ${basedir}refs/ \
--dict ${basedir}refs/VectorBase-65_AfunestusFUMOZ_Genome.dict \
--fasta_fai ${basedir}refs/VectorBase-65_AfunestusFUMOZ_Genome.fasta.fai \
--input ${basedir}metadata/sarek_test_metadata.csv \
--outdir . \
--joint_germline \
--skip_tools baserecalibrator \
--tools haplotypecaller \
-profile sge,singularity \
-c ${basedir}custom_resources_w_markduplicates.conf \
-resume

ERROR ~ Error executing process > 'NFCORE_SAREK:SAREK:BAM_MARKDUPLICATES:GATK4_MARKDUPLICATES (Ken4590)'

Caused by:
  Process `NFCORE_SAREK:SAREK:BAM_MARKDUPLICATES:GATK4_MARKDUPLICATES (Ken4590)` terminated with an error exit status (1)

Command executed:

  gatk --java-options "-Xmx49152M -XX:-UsePerfData" \
      MarkDuplicates \
      --INPUT Ken4590-1.0012.bam --INPUT Ken4590-1.0005.bam --INPUT Ken4590-1.0003.bam --INPUT Ken4590-1.0009.bam --INPUT Ken4590-1.0004.bam --INPUT Ken4590-1.0002.bam --INPUT Ken4590-1.0001.bam --INPUT Ken4590-1.0011.bam --INPUT Ken4590-1.0008.bam --INPUT Ken4590-1.0007.bam --INPUT Ken4590-1.0010.bam --INPUT Ken4590-1.0006.bam \
      --OUTPUT Ken4590.md.bam \
      --METRICS_FILE Ken4590.md.cram.metrics \
      --TMP_DIR . \
      --REFERENCE_SEQUENCE VectorBase-65_AfunestusFUMOZ_Genome.fasta \
      -REMOVE_DUPLICATES false -VALIDATION_STRINGENCY LENIENT

  # If cram files are wished as output, then run samtools for conversion
  if [[ Ken4590.md.cram == *.cram ]]; then
      samtools view -Ch -T VectorBase-65_AfunestusFUMOZ_Genome.fasta -o Ken4590.md.cram Ken4590.md.bam
      rm Ken4590.md.bam
      samtools index Ken4590.md.cram
  fi

  cat <<-END_VERSIONS > versions.yml
  "NFCORE_SAREK:SAREK:BAM_MARKDUPLICATES:GATK4_MARKDUPLICATES":
      gatk4: $(echo $(gatk --version 2>&1) | sed 's/^.*(GATK) v//; s/ .*$//')
      samtools: $(echo $(samtools --version 2>&1) | sed 's/^.*samtools //; s/Using.*$//')
  END_VERSIONS

Command exit status:
  1

Command output:
  Error occurred during initialization of VM
  Could not reserve enough space for 50331648KB object heap

Command error:
  Using GATK jar /usr/local/share/gatk4-4.4.0.0-0/gatk-package-4.4.0.0-local.jar
  Running:
      java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -Xmx49152M -XX:-UsePerfData -jar /usr/local/share/gatk4-4.4.0.0-0/gatk-package-4.4.0.0-local.jar MarkDuplicates --INPUT Ken4590-1.0012.bam --INPUT Ken4590-1.0005.bam --INPUT Ken4590-1.0003.bam --INPUT Ken4590-1.0009.bam --INPUT Ken4590-1.0004.bam --INPUT Ken4590-1.0002.bam --INPUT Ken4590-1.0001.bam --INPUT Ken4590-1.0011.bam --INPUT Ken4590-1.0008.bam --INPUT Ken4590-1.0007.bam --INPUT Ken4590-1.0010.bam --INPUT Ken4590-1.0006.bam --OUTPUT Ken4590.md.bam --METRICS_FILE Ken4590.md.cram.metrics --TMP_DIR . --REFERENCE_SEQUENCE VectorBase-65_AfunestusFUMOZ_Genome.fasta -REMOVE_DUPLICATES false -VALIDATION_STRINGENCY LENIENT

Work dir:
  <path>/response/test/work/2f/47a0d3b493240c590a3540d1a88bd5

Tip: when you have fixed the problem you can continue the execution adding the option `-resume` to the run command line

 -- Check '.nextflow.log' file for details

Relevant files

nextflow.log.zip sarek_test.zip

System information

Nextflow version: 23.04.1
nf-core/sarek version: 3.3.2-gf034b73
Executor: HPC with Singularity on SGE (CentOS)

Truongphikt commented 1 month ago

@rrlove-cdc Did you solve this problem? I ran into the same bug in NFCORE_SAREK:SAREK:FASTQC. I would be grateful if you could share any information. Thanks.

[nf-core/sarek] Pipeline completed with errors-
ERROR ~ Error executing process > 'NFCORE_SAREK:SAREK:FASTQC (test-test_L2)'

Caused by:
  Process `NFCORE_SAREK:SAREK:FASTQC (test-test_L2)` terminated with an error exit status (1)

Command executed:

  printf "%s %s\n" test_1.fastq.gz test-test_L2_1.gz test_2.fastq.gz test-test_L2_2.gz | while read old_name new_name; do
      [ -f "${new_name}" ] || ln -s $old_name $new_name
  done

  fastqc \
      --quiet \
      --threads 2 \
      --memory 6656 \
      test-test_L2_1.gz test-test_L2_2.gz

  cat <<-END_VERSIONS > versions.yml
  "NFCORE_SAREK:SAREK:FASTQC":
      fastqc: $( fastqc --version | sed '/FastQC v/!d; s/.*v//' )
  END_VERSIONS

Command exit status:
 1

Command output:
  Error occurred during initialization of VM
  Could not reserve enough space for 13631488KB object heap

Command wrapper:
  Error occurred during initialization of VM
  Could not reserve enough space for 13631488KB object heap

.nextflow.log
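
For reference, the FASTQC failure above looks like the same pattern: the JVM requests a roughly 13 GB heap (13631488 KB) that the node or scheduler will not provide. Assuming the nf-core FASTQC module also sizes its --memory flag from task.memory (the --memory 6656 in the command above is consistent with that), a minimal sketch of capping it with a custom config passed via -c:

    process {
        // Example values only: size them to what the queue can actually grant.
        withName: 'FASTQC' {
            memory = 4.GB
            cpus   = 2
        }
    }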