open2c / distiller-nf

A modular Hi-C mapping pipeline

Error: Process `map_parse_sort_chunks (library:HSPR2 run:lane1 chunk:0)` terminated with an error exit status (140) #166

Closed · hzjsxu closed this issue 3 years ago

hzjsxu commented 3 years ago

Hi,

I ran into a problem when using the distiller-nf pipeline. Could someone help me, please?

Here is the info. I run the pipeline without Docker or Singularity.

Thanks.

# LSBATCH: User input
nextflow /work/bio-xujs/biosoft/distiller-nf/distiller.nf -params-file human_Sperm.project.yml -profile cluster -without-docker

------------------------------------------------------------

Exited with exit code 1.

Resource usage summary:

    CPU time :                                   62.22 sec.
    Max Memory :                                 1652 MB
    Average Memory :                             1166.18 MB
    Total Requested Memory :                     -
    Delta Memory :                               -
    Max Swap :                                   -
    Max Processes :                              7
    Max Threads :                                68
    Run time :                                   43239 sec.
    Turnaround time :                            43285 sec.

The output (if any) follows:

N E X T F L O W  ~  version 19.01.0
Launching `/work/bio-xujs/biosoft/distiller-nf/distiller.nf` [grave_cajal] - revision: 0edeab7502
WARN: There's no process matching config selector: download_sra
[warm up] executor > lsf
[ee/551a4e] Submitted process > map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)
[81/bfc866] Submitted process > map_parse_sort_chunks (library:HSPR1 run:lane1 chunk:0)
[ac/71f7d2] Submitted process > map_parse_sort_chunks (library:HSPR2 run:lane1 chunk:0)
[ee/551a4e] NOTE: Process `map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)` terminated with an error exit status (140) -- Execution is retried (1)
[81/bfc866] NOTE: Process `map_parse_sort_chunks (library:HSPR1 run:lane1 chunk:0)` terminated with an error exit status (140) -- Execution is retried (1)
[3d/b68bd0] NOTE: Process `map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)` terminated with an error exit status (140) -- Execution is retried (2)
[9e/c0337d] NOTE: Process `map_parse_sort_chunks (library:HSPR1 run:lane1 chunk:0)` terminated with an error exit status (140) -- Execution is retried (2)
[eb/161367] Re-submitted process > map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)
[83/b0337d] Re-submitted process > map_parse_sort_chunks (library:HSPR1 run:lane1 chunk:0)
[c1/0f7b55] NOTE: Process `map_parse_sort_chunks (library:HSPR2 run:lane1 chunk:0)` terminated with an error exit status (140) -- Execution is retried (2)
[3b/c64083] Re-submitted process > map_parse_sort_chunks (library:HSPR2 run:lane1 chunk:0)
ERROR ~ Error executing process > 'map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)'

Caused by:
  Process `map_parse_sort_chunks (library:HSPR3 run:lane1 chunk:0)` terminated with an error exit status (140)

Command executed:

  TASK_TMP_DIR=$(mktemp -d -p ./ distiller.tmp.XXXXXXXXXX)
  touch HSPR3.lane1.hg19.0.bam

  bwa mem -t 40 -SP hg19.fa HSPR3_1.fq.gz HSPR3_2.fq.gz \
      | pairtools parse --drop-sam --drop-readid --add-columns mapq -c hg19.chrom.size \
      | pairtools sort --nproc 40 -o HSPR3.lane1.hg19.0.pairsam.gz --tmpdir $TASK_TMP_DIR \
      | cat

  rm -rf $TASK_TMP_DIR

Command exit status:
  140

Command output:
  (empty)

Command error:
  [M::mem_pestat] # candidate unique pairs for (FF, FR, RF, RR): (1616, 850715, 1094, 1385)
  [M::mem_pestat] analyzing insert size distribution for orientation FF...
  [M::mem_pestat] (25, 50, 75) percentile: (659, 2678, 5616)
  [M::mem_pestat] low and high boundaries for computing mean and std.dev: (1, 15530)
  [M::mem_pestat] mean and std.dev: (3363.14, 2878.99)
  [M::mem_pestat] low and high boundaries for proper pairs: (1, 20487)
  [M::mem_pestat] analyzing insert size distribution for orientation FR...
  [M::mem_pestat] (25, 50, 75) percentile: (274, 343, 426)
  [M::mem_pestat] low and high boundaries for computing mean and std.dev: (1, 730)
  [M::mem_pestat] mean and std.dev: (353.36, 116.00)
  [M::mem_pestat] low and high boundaries for proper pairs: (1, 882)
  [M::mem_pestat] analyzing insert size distribution for orientation RF...
  [M::mem_pestat] (25, 50, 75) percentile: (1564, 3673, 6637)
  [M::mem_pestat] low and high boundaries for computing mean and std.dev: (1, 16783)
  [M::mem_pestat] mean and std.dev: (4150.55, 2917.57)
  [M::mem_pestat] low and high boundaries for proper pairs: (1, 21856)
  [M::mem_pestat] analyzing insert size distribution for orientation RR...
  [M::mem_pestat] (25, 50, 75) percentile: (1164, 2982, 5928)
  [M::mem_pestat] low and high boundaries for computing mean and std.dev: (1, 15456)
  [M::mem_pestat] mean and std.dev: (3730.98, 2907.65)
  [M::mem_pestat] low and high boundaries for proper pairs: (1, 20220)
  [M::mem_pestat] skip orientation FF
  [M::mem_pestat] skip orientation RF

Work dir:
  /scratch/2021-03-01/bio-xujs/hic/human/work/eb/161367217e494646aea3963151af07

Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`

 -- Check '.nextflow.log' file for details
WARN: Killing pending tasks (2)
Phlya commented 3 years ago

Exit status 140 points to insufficient resources for the job on the cluster, probably memory. You can check how much memory the processes used in the report in the pipeline_info folder (or something like that). You probably need to increase the memory allocation in the profile file.
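
A minimal sketch of what such an override could look like in a Nextflow profile config (the exact file, e.g. a cluster profile in your distiller-nf checkout, and the process selector may differ; the 64 GB value is only an example):

```groovy
// Hypothetical per-process resource override in the cluster profile.
// `withName` and `memory` are standard Nextflow process directives.
process {
    withName: 'map_parse_sort_chunks' {
        memory = 64.GB   // raise this if the scheduler kills the job for exceeding its memory limit
    }
}
```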

hzjsxu commented 3 years ago

> Exit status 140 points to insufficient resources for the job on the cluster, probably memory. You can check how much memory the processes used in the report in the pipeline_info folder (or something like that). You probably need to increase the memory allocation in the profile file.

I have tried to increase the memory allocation, but it doesn't seem to work. Where in report.html can I check how much memory the processes used? Is it shown in the attached screenshot?

If so, I have already increased the memory to 300 GB, and it still fails (screenshot attached).

Phlya commented 3 years ago

What about the time limit then?

hzjsxu commented 3 years ago

> What about the time limit then?

OK! It works after I increased the time limit. Thanks!
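
A hedged sketch of the corresponding change, assuming the same profile config file and selector as in the memory example above (the 48 h value is only an example):

```groovy
// Hypothetical wall-time override for the failing process; `time` is a standard
// Nextflow directive. Exit status 140 is typically returned when the scheduler
// kills a job for exceeding its run-time or memory limit.
process {
    withName: 'map_parse_sort_chunks' {
        time = 48.h   // raise the wall-clock limit so long bwa/pairtools runs can finish
    }
}
```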