jtamames / SqueezeMeta

A complete pipeline for metagenomic analysis
GNU General Public License v3.0

Error with SqueezeMeta STEP 1 #674

Closed avinashDhar1993 closed 1 year ago

avinashDhar1993 commented 1 year ago

Hi all, I got the following error while running SqueezeMeta:

Stopping in STEP1 -> 01.merge_assemblies.pl. Program finished abnormally. (Error in /home/avinash/SqueezeMeta/scripts/01.run_all_assemblies.pl, line number 233)

I have attached the syslog file for your reference: https://easyupload.io/d0s55u. Thanks in advance for your help.

fpusan commented 1 year ago

Your syslog indicates that mummer is segfaulting. The command

LD_LIBRARY_PATH=/home/avinash/SqueezeMeta/bin/AMOS/../../lib/mummer /home/avinash/SqueezeMeta/bin/AMOS/../mummer/nucmer --maxmatch --threads 12 -c 100 /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99.ref.seq /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99.qry.seq -p /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99

exited with status 139. How much memory do you have available?
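For context (background knowledge, not stated in the thread): a shell reports exit status 139 when a process is killed by signal 11 (SIGSEGV), because statuses above 128 encode 128 + the signal number. A minimal sketch of decoding such a status:

```shell
# Decode an exit status above 128 into the terminating signal number.
# 139 - 128 = 11, i.e. SIGSEGV (segmentation fault), matching the syslog.
status=139
if [ "$status" -gt 128 ]; then
    sig=$((status - 128))
    echo "process killed by signal $sig"
fi
```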

avinashDhar1993 commented 1 year ago

> Your syslog indicates that mummer is segfaulting. The command
>
> LD_LIBRARY_PATH=/home/avinash/SqueezeMeta/bin/AMOS/../../lib/mummer /home/avinash/SqueezeMeta/bin/AMOS/../mummer/nucmer --maxmatch --threads 12 -c 100 /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99.ref.seq /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99.qry.seq -p /mnt/e/1_clean_data/Aerobiome/temp/mergedassemblies.Aerobiome.99
>
> exited with status 139. How much memory do you have available?

Thanks for your reply. I am running this on WSL2 on a Windows 11 system. I have around 426 GB of free disk space and 64 GB of RAM on my system.

fpusan commented 1 year ago

Can you try to reproduce the issue in an Ubuntu VM? WSL2 works quite well nowadays, but one never knows. Alternatively, try downloading and recompiling mummer in WSL2, then use that binary to replace the one we ship with SqueezeMeta.
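A sketch of the recompile-and-swap suggestion. The MUMmer release version and URL are assumptions (check the mummer4 GitHub releases page for the current one), and the SqueezeMeta install path is taken from the log above; adjust both to your setup:

```shell
# Build MUMmer from source inside WSL2 (version/URL are assumptions).
wget https://github.com/mummer4/mummer/releases/download/v4.0.0rc1/mummer-4.0.0rc1.tar.gz
tar xzf mummer-4.0.0rc1.tar.gz
cd mummer-4.0.0rc1
./configure && make

# Back up the binary bundled with SqueezeMeta, then replace it with
# the freshly built one (path taken from the nucmer command in the log).
cp /home/avinash/SqueezeMeta/bin/mummer/nucmer /home/avinash/SqueezeMeta/bin/mummer/nucmer.bak
cp nucmer /home/avinash/SqueezeMeta/bin/mummer/nucmer
```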

avinashDhar1993 commented 1 year ago

Hi fpusan,

Thanks a lot for your response. I reduced the number of samples and the above-mentioned error was resolved. However, there is a new problem: the pipeline gets stuck at step 6 and doesn't proceed at all. I am copying the output below:

SqueezeMeta v1.6.2, March 2023 - (c) J. Tamames, F. Puente-Sánchez CNB-CSIC, Madrid, SPAIN
Please cite: Tamames & Puente-Sanchez, Frontiers in Microbiology 9, 3349 (2019). doi: https://doi.org/10.3389/fmicb.2018.03349
Run started Thu May 4 10:34:08 2023 in merged mode
7 metagenomes found: SC9551 SC9552 SC9553 SC9555 SC9556 SC9559 SC9560
Contig file /mnt/e/1_clean_data/raw/chile/Aero_chile/results/01.Aero_chile.fasta already found, skipping step 1
RNA gff file /mnt/e/1_clean_data/raw/chile/Aero_chile/intermediate/02.Aero_chile.maskedrna.fasta already found, skipping step 2
Aminoacid file /mnt/e/1_clean_data/raw/chile/Aero_chile/results/03.Aero_chile.faa already found, skipping step 3
Diamond file /mnt/e/1_clean_data/raw/chile/Aero_chile/intermediate/04.Aero_chile.nr.diamond already found, skipping step 4
Pfam file /mnt/e/1_clean_data/raw/chile/Aero_chile/intermediate/05.Aero_chile.pfam.hmm already found, skipping step 5
[1 second]: STEP6 -> TAXONOMIC ASSIGNMENT: 06.lca.pl
AVAILABLE (free) RAM memory: 0.00 Gb
We will set the number of threads to -2 for this step
Splitting Diamond file
Starting multithread LCA in -2 threads
Creating /mnt/e/1_clean_data/raw/chile/Aero_chile/results/06.Aero_chile.fun3.tax.wranks file (stuck at this step for almost 5 days)
Died at scripts/SqueezeMeta.pl line 941. (had to kill the command at this step)

Link to the syslog file: https://easyupload.io/oomvs2
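One detail in that log stands out: SqueezeMeta detected 0.00 Gb of free RAM and derived a thread count of -2 for the LCA step, which would explain the hang. A quick diagnostic sketch (my own heuristic, not SqueezeMeta's actual code) for checking what the Linux kernel inside WSL2 is actually reporting:

```shell
# Show memory as reported inside WSL2; a bogus "available" figure here
# would propagate into SqueezeMeta's (negative) thread calculation.
free -g

# Extract the "available" column of the Mem row for scripting.
avail_gb=$(free -g | awk '/^Mem:/ {print $7}')
echo "available RAM: ${avail_gb} Gb"
```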

fpusan commented 1 year ago

Hi again! From your paths I see that you are using WSL2. See the discussion in #695. You should host your database and project in the Linux FS, not in the Windows one (in your case, drive E:). Even if that drive is accessible from within WSL2, IO performance may be very low.
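A sketch of that relocation under the paths used in this thread (substitute your own; the helper function name is mine, for illustration):

```shell
# relocate: copy a directory tree from a Windows-backed mount (/mnt/e, slow
# under WSL2) into the native Linux filesystem, and echo the new location.
relocate() {
    src=$1
    dst=$2
    mkdir -p "$dst"
    cp -r "$src/." "$dst/"
    echo "$dst"
}

# Example with this thread's project path:
# relocate /mnt/e/1_clean_data/raw/chile/Aero_chile "$HOME/Aero_chile"
# Then restart SqueezeMeta pointing at the Linux-native copy.
```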

fpusan commented 1 year ago

Closing due to lack of activity, but see the solution in #695.