aghozlane / masque

Metagenomic AnalySis with QUantitative pipEline
GNU Lesser General Public License v3.0

"extendedFrags_fastqc.html" does not exist or is empty #1

Open dridk opened 7 years ago

dridk commented 7 years ago

The script launch failed. My data are paired-end 16S data (Illumina V3-V5).

myData/ 
1001_R1.fastq  1082_R2.fastq  1222_R1.fastq  2006_R2.fastq  2060_R1.fastq  2120_R2.fastq  2178_R1.fastq  2309_R2.fastq  2375_R1.fastq  3025_R2.fastq  3089_R1.fastq  3143_R2.fastq  3209_R1.fastq
1001_R2.fastq  1086_R1.fastq  1222_R2.fastq  2007_R1.fastq  2060_R2.fastq  2122_R1.fastq  2178_R2.fastq  2311_R1.fastq  2375_R2.fastq  3026_R1.fastq  3089_R2.fastq  3145_R1.fastq  3209_R2.fastq
1003_R1.fastq  1086_R2.fastq  1230_R1.fastq  2007_R2.fastq  2061_R1.fastq  2122_R2.fastq  2179_R1.fastq  2311_R2.fastq  2378_R1.fastq  3026_R2.fastq  3092_R1.fastq  3145_R2.fastq  3210_R1.fastq
1003_R2.fastq 

After running:

Minimum read length [--minreadlength]= 35
Minimum phred quality [--minphred]= 20
Minimum allowed percentage of correctly called nucleotides [--minphredperc]= 80
Minimum number of mismatches for the filtering [--NbMismatchMapping]= 1
Filtering databases= human phi
# OTU process:
Dereplication is in full length mode
Minimum length of an amplicon [--minampliconlength]= 64
Minimum size of an OTU for singleton removal [--minotusize]= 4
Chimera filtering is in de novo mode
Clustering is performed with vsearch
# 16S/18S annotation
Identity threshold with vsearch [--identityThreshold]= 0.75
Conserved position for alignment [--conservedPosition]= 0.5
Tree generated in fast mode with FastTree
* Start analysis
* Start working on reads
* 1/188 - Quality control with Fastqc
* File "/mydata/result//reads//1001.extendedFrags_fastqc.html" does not exist or is empty !
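
A quick way to check whether the merge step produced anything at all, assuming FLASH writes the merged reads under /mydata/result/reads/ (the path in the error) and its log under /mydata/result/log/:

    # Is the merged-reads file present and non-empty? (path assumed from the error)
    ls -l /mydata/result/reads/1001.extendedFrags*
    # The FLASH log reports how many pairs were combined
    cat /mydata/result/log/log_flash_1001.txt
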
aghozlane commented 7 years ago

Are you using the Docker version?

dridk commented 7 years ago

Yes, of course. It seems my files are empty. After checking /mydata/result/log/log_mapping_1001_1.txt:

Warning: Could not open read file "/mydata/result/reads//filter_human_0/un-conc-mate.1" for reading; skipping...
Error: No input read files were valid
(ERR): bowtie2-align exited with value 1
root@a9bcd2f0c30f:/mydata/result/log# ls
log_flash_1001.txt  log_mapping_1001_1.txt
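
A sketch for confirming what the human-filtering pass actually wrote, with the directory taken from the warning above:

    # The second bowtie2 pass expects the unmapped pairs from the first pass here
    ls -l /mydata/result/reads/filter_human_0/
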
dridk commented 7 years ago

How can I see which command is run? I can try to run it alone to see if it works.

aghozlane commented 7 years ago

Inside the Docker container: vi /usr/local/bin/masque -> line 726. I'm trying to reproduce the bug.
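
For readers who want to see the exact command without opening an editor, a sketch assuming masque is a plain shell script (which the line-number reference suggests):

    # Print the lines around the failing call inside the container
    sed -n '720,730p' /usr/local/bin/masque
    # Or re-run with tracing to log every command the script executes
    bash -x /usr/local/bin/masque -i /mydata/ -o /mydata/result/ 2> /mydata/trace.log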

aghozlane commented 7 years ago

Is one of your files empty? Can you give me the command line that you ran?
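
One minimal way to answer the first question, assuming the inputs sit directly under /mydata/: look for zero-byte files, and for files whose line count is not a multiple of four (a truncated FASTQ):

    # Zero-byte inputs
    find /mydata/ -maxdepth 1 -name '*.fastq' -size 0
    # A valid FASTQ holds 4 lines per read
    for f in /mydata/*.fastq; do
        lines=$(wc -l < "$f")
        [ $((lines % 4)) -ne 0 ] && echo "truncated: $f ($lines lines)"
    done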

dridk commented 7 years ago

I'm going to give you the first fastq pair.

dridk commented 7 years ago

Here is the first fastq pair (http://dl.free.fr/lpYtxessT). I think the first step of the pipeline returns an empty file.

dridk commented 7 years ago

I just ran:

    masque -i /mydata/ -o /mydata/result/
dridk commented 7 years ago

I just ran the first command by myself and it works fine. So I guess the problem comes from masque; it doesn't start the command as expected.

  java -jar /usr/local/bin/AlienTrimmer_0.4.0/src/AlienTrimmer.jar -if /mydata/1001_R1.fastq -ir /mydata/1001_R2.fastq -of alien_f.fastq -or alien_r.fastq -os alien_s.fastq -c /usr/local/bin/databases/alienTrimmerPF8contaminants.fasta
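
To confirm the trimmer really produced reads rather than empty files, a quick count over the outputs named above:

    # Each read is 4 lines; a zero here means the step produced nothing
    for f in alien_f.fastq alien_r.fastq alien_s.fastq; do
        echo "$f: $(( $(wc -l < "$f") / 4 )) reads"
    done
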
aghozlane commented 7 years ago

Yes, I'm tracking the bug...

aghozlane commented 7 years ago

The bug comes from bowtie2:

    /usr/local/bin/bowtie2-2.2.9/bowtie2 -q -N 1 -p 8 -x /usr/local/bin/databases/homo_sapiens.fna -1 1001_R1.fastq -2 1001_R2.fastq -S /dev/null --un-conc test/reads//filter_human_0_test -t --very-fast
    Time loading reference: 00:00:00
    Time loading forward index: 00:00:01
    Time loading mirror index: 00:00:01
    Error: reads file does not look like a FASTQ file
    terminate called after throwing an instance of 'int'
    Aborted (core dumped)
    (ERR): bowtie2-align exited with value 134
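
That error usually means the first record bowtie2 reads is malformed. A sketch for eyeballing the record structure of the rejected file (filename taken from the command above):

    # A well-formed record is: @header / sequence / + / quality
    head -n 4 1001_R1.fastq
    # Every 4-line record must start with '@' and carry '+' on its third line
    awk 'NR % 4 == 1 && $0 !~ /^@/ {print "bad header at line " NR; exit 1}
         NR % 4 == 3 && $0 !~ /^\+/ {print "bad separator at line " NR; exit 1}' 1001_R1.fastq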

I will look into how to resolve this.

aghozlane commented 7 years ago

The Docker image is updated, but for the moment the bug is not fixed.