ablab / spades

SPAdes Genome Assembler
http://ablab.github.io/spades/

err code -9 #1122

Open cpouchon opened 1 year ago

cpouchon commented 1 year ago

Description of bug

Dear developer and community,

I am reporting an err code -9 that I get when I try to assemble shotgun sequences from Myxomycetes.

I have no problems assembling sequences in any of my other runs, but I always get this -9 error for large libraries.

Here the library has more than 35M reads. I see the same thing for plant sequences (on another computing infrastructure, with a version of SPAdes not packaged in conda) for libraries with more than 25M reads.

I already tried requesting nodes with more RAM, but the run crashes in the same way (same error reported).

Do you know how I could solve this error?

Thank you very much,

All the best,

Charles

spades.log

Command line: /home/users/p/pouchon/.conda/envs/refmaker-env/bin/spades.py -1 /home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_forward_paired.fq.gz -2 /home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_reverse_paired.fq.gz --cov-cutoff auto -o /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4 -t 32 -m 330 -k 55

System information:
  SPAdes version: 3.13.1
  Python version: 3.8.8
  OS: Linux-3.10.0-1160.80.1.el7.x86_64-x86_64-with-glibc2.10

Output dir: /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4
Mode: read error correction and assembling
Debug mode is turned OFF

Dataset parameters:
  Multi-cell mode (you should set '--sc' flag if input data was obtained with MDA (single-cell) technology or --meta flag if processing metagenomic dataset)
  Reads:
    Library number: 1, library type: paired-end
      orientation: fr
      left reads: ['/home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_forward_paired.fq.gz']
      right reads: ['/home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_reverse_paired.fq.gz']
      interlaced reads: not specified
      single reads: not specified
      merged reads: not specified
Read error correction parameters:
  Iterations: 1
  PHRED offset will be auto-detected
  Corrected reads will be compressed
Assembly parameters:
  k: [55]
  Repeat resolution is enabled
  Mismatch careful mode is turned OFF
  MismatchCorrector will be SKIPPED
  Coverage cutoff is turned ON and threshold will be auto-detected
Other parameters:
  Dir for temp files: /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4/tmp
  Threads: 32
  Memory limit (in Gb): 330

======= SPAdes pipeline started. Log can be found here: /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4/spades.log

===== Read error correction started.

== Running read error correction tool: /home/users/p/pouchon/.conda/envs/refmaker-env/share/spades-3.13.1-0/bin/spades-hammer /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4/corrected/configs/config.info

0:00:00.000 4M / 4M INFO General (main.cpp : 75) Starting BayesHammer, built from refs/heads/spades_3.13.1, git revision 9a9d54db2ff9abaac718155bf74c12ec9464e8ca
0:00:00.000 4M / 4M INFO General (main.cpp : 76) Loading config from /home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4/corrected/configs/config.info
0:00:00.009 4M / 4M INFO General (main.cpp : 78) Maximum # of threads to use (adjusted due to OMP capabilities): 32
0:00:00.010 4M / 4M INFO General (memory_limit.cpp : 49) Memory limit set to 330 Gb
0:00:00.010 4M / 4M INFO General (main.cpp : 86) Trying to determine PHRED offset
0:00:00.059 4M / 4M INFO General (main.cpp : 92) Determined value is 33
0:00:00.060 4M / 4M INFO General (hammer_tools.cpp : 36) Hamming graph threshold tau=1, k=21, subkmer positions = [ 0 10 ]
0:00:00.060 4M / 4M INFO General (main.cpp : 113) Size of aux. kmer data 24 bytes
=== ITERATION 0 begins ===
0:00:00.062 4M / 4M INFO K-mer Index Building (kmer_index_builder.hpp : 301) Building kmer index
0:00:00.062 4M / 4M INFO General (kmer_index_builder.hpp : 117) Splitting kmer instances into 512 files using 32 threads. This might take a while.
0:00:00.062 4M / 4M INFO General (file_limit.hpp : 32) Open file limit set to 1024
0:00:00.062 4M / 4M INFO General (kmer_splitters.hpp : 89) Memory available for splitting buffers: 3.43746 Gb
0:00:00.062 4M / 4M INFO General (kmer_splitters.hpp : 97) Using cell size of 131072
0:00:02.372 17G / 17G INFO K-mer Splitting (kmer_data.cpp : 97) Processing /home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_forward_paired.fq.gz
0:00:19.860 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 2355267 reads
0:00:36.417 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 4697031 reads
0:00:52.999 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 6993311 reads
0:01:09.709 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 9298832 reads
0:01:26.812 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 11655030 reads
0:01:44.195 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 13923605 reads
0:02:02.081 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 16246764 reads
0:02:18.987 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 18576632 reads
0:02:35.778 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 20934085 reads
0:02:54.419 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 23279345 reads
0:04:20.803 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 35072873 reads
0:08:30.461 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 68066851 reads
0:17:10.610 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 135559748 reads
0:21:35.310 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 97) Processing /home/users/p/pouchon/Myxomycetes_refmaker/data/DR4_reverse_paired.fq.gz
0:33:58.682 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 107) Processed 269267395 reads
0:43:01.310 17G / 18G INFO K-mer Splitting (kmer_data.cpp : 112) Total 339926882 reads processed
0:43:02.792 128M / 18G INFO General (kmer_index_builder.hpp : 120) Starting k-mer counting.
0:46:20.197 128M / 18G INFO General (kmer_index_builder.hpp : 127) K-mer counting done. There are 2984406844 kmers in total.
0:46:20.197 128M / 18G INFO General (kmer_index_builder.hpp : 133) Merging temporary buckets.
0:46:48.668 128M / 18G INFO K-mer Index Building (kmer_index_builder.hpp : 314) Building perfect hash indices
0:48:24.891 1G / 18G INFO General (kmer_index_builder.hpp : 150) Merging final buckets.
0:48:42.222 1G / 18G INFO K-mer Index Building (kmer_index_builder.hpp : 336) Index built. Total 1383907922 bytes occupied (3.7097 bits per kmer).
0:48:42.227 1G / 18G INFO K-mer Counting (kmer_data.cpp : 356) Arranging kmers in hash map order
0:50:11.153 46G / 46G INFO General (main.cpp : 148) Clustering Hamming graph.
1:24:26.625 46G / 46G INFO General (main.cpp : 155) Extracting clusters
1:44:38.978 46G / 85G INFO General (main.cpp : 167) Clustering done. Total clusters: 450465992
1:44:39.082 23G / 85G INFO K-mer Counting (kmer_data.cpp : 376) Collecting K-mer information, this takes a while.

== Error == system call for: "['/home/users/p/pouchon/.conda/envs/refmaker-env/share/spades-3.13.1-0/bin/spades-hammer', '/home/users/p/pouchon/Myxomycetes_refmaker/assembly/spades/Cribraria_tenella_1334524_DR4/corrected/configs/config.info']" finished abnormally, err code: -9

In case you have troubles running SPAdes, you can write to spades.support@cab.spbu.ru or report an issue on our GitHub repository github.com/ablab/spades
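
For context (this explanation is not part of the original log): a negative err code from spades.py means the child process was terminated by a signal, and -9 corresponds to SIGKILL, the signal the kernel's out-of-memory killer or a batch scheduler sends when a job exceeds its memory allocation. A minimal shell illustration of the same exit-status convention (the sleep command is just a stand-in for spades-hammer):

sleep 300 &              # stand-in for the long-running spades-hammer step
kill -9 $!               # deliver SIGKILL, as the OOM killer / scheduler would
wait $!
echo "exit status: $?"   # prints 137, i.e. 128 + 9; Python's subprocess (used by spades.py) reports the same event as return code -9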

params.txt

The SPAdes assembly is run within my in-house pipeline; here I pasted only my SLURM job parameters.

#!/bin/sh

#SBATCH --job-name refmaker_myxo
#SBATCH --error assembly-error.e%j
#SBATCH --output assembly-out.o%j
#SBATCH --ntasks 1
#SBATCH --cpus-per-task 32
#SBATCH --partition shared-bigmem
#SBATCH --time 12:00:00

THREADS=32
MEMORY=330
KMER=55
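
Note that the script above requests CPUs and wall time but no memory, while the spades.py command sets its own limit to 330 GB (-m 330). If the partition enforces a default per-CPU or per-job memory limit below that, the cgroup/OOM killer will terminate spades-hammer with SIGKILL, which matches the err code -9 above. A hedged sketch of an explicit memory request (the 350G figure, and the assumption that shared-bigmem enforces a lower default, are illustrative and not facts from this report):

#SBATCH --job-name refmaker_myxo
#SBATCH --partition shared-bigmem
#SBATCH --ntasks 1
#SBATCH --cpus-per-task 32
#SBATCH --mem 350G          # ask SLURM for at least as much as SPAdes' -m 330 limit
#SBATCH --time 12:00:00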

SPAdes version

SPAdes v3.13.1

Operating System

HPC cluster of the University of Geneva (Linux)

Python Version

Python 3.9.6

Method of SPAdes installation

conda

No errors reported in spades.log

asl commented 1 year ago

Hello

Your SPAdes job was killed by an external process. More information can likely be found in the system log or obtained from your system administrator.

PS: I would also suggest upgrading to the latest version of SPAdes.
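
Following up on the suggestion to check the system logs: on a SLURM cluster, the accounting records usually show whether the job was killed for exceeding its memory allocation. The commands below are a general sketch (the job ID is a placeholder, and seff is only available where the site has installed it):

sacct -j <jobid> --format=JobID,JobName,State,ExitCode,MaxRSS,ReqMem,Elapsed   # State shows OUT_OF_MEMORY if the cgroup limit was hit
seff <jobid>                                         # per-job memory/CPU efficiency summary, if installed
dmesg -T | grep -iE 'out of memory|killed process'   # on the compute node; may require admin access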