Hi @anshulbudhraja,
Thank you for your interest in BLAZE. Since you've successfully run BLAZE on the test dataset, the issue is unlikely to come from your environment configuration. One potential problem could be your FASTQ file: is it possible that it is incomplete, perhaps due to accidentally deleted lines? A good starting point for troubleshooting would be to check whether the number of lines in the file is a multiple of 4.
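A minimal sketch of that check (your.fastq is a placeholder filename):

$ echo $(( $(wc -l < your.fastq) % 4 ))

A result of 0 is necessary (though not sufficient) for an intact file, since every FASTQ record spans exactly 4 lines; any other result means the file is truncated or corrupted mid-record.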
Hi @youyupei, thanks for the reply! My file shows:
$ wc -l 10H174_pass_final.fastq
435369176
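For what it's worth, 435,369,176 divides evenly by 4 (108,842,294 reads, matching the read count BLAZE reports below), so the simple line-count check passes. A stricter structural check, sketched here as a hypothetical follow-up, is to verify that every record header line starts with @:

$ awk 'NR % 4 == 1 && substr($0, 1, 1) != "@" {print "malformed header at line", NR; exit 1}' 10H174_pass_final.fastq

This prints nothing for a well-formed file and flags the first misaligned record otherwise.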
I'm now trying a different FASTQ with fewer reads (one way to make such a subset is sketched after the line count below):
$ wc -l 10H174_pass_sub50M.fastq
200000000
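Since FASTQ records are 4 lines each, 200,000,000 lines correspond to 50M reads. One way to make such a subset, keeping whole records by cutting at a multiple of 4 lines (a sketch, not necessarily how this file was produced):

$ head -n 200000000 10H174_pass_final.fastq > 10H174_pass_sub50M.fastq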
The progress of that run:
(05/03/2024 11:45:09) Getting putative barcodes from 1 FASTQ files...
Processed: 50000000Read [20:26, 40753.60Read/s]
Counting high-quality putative BC: 50it [01:11, 1.44s/it]
(05/03/2024 12:06:49) Getting barcode whitelist and empty droplet barcode list...
(05/03/2024 12:06:54) Creating emtpy droplets barocde list...
(05/03/2024 12:11:46) Assigning reads to whitelist.
Processed: 50000000Read [1:10:02, 11897.51Read/s]
(05/03/2024 13:21:48) Reads assignment completed. Demultiplexed read saved in blazeOutsub50Mmatched_reads.fastq.gz!
I believe you're right that the issue must be with the original fastq I was working with. Thank you for your time!
Hi! I am currently trying to run BLAZE, installed in a virtualenv, on a cluster where I'm using SLURM to submit the job. The job fails with an error, although

blaze --help

runs fine in the virtualenv, and the run gets as far as reporting: Total number of reads in the input file are: 108,842,294. My code/command was:

My virtualenv contains the following packages:
Edit: I have successfully run the test data (from the BLAZE/test folder) using the blaze command through the virtualenv. I launched the job through SLURM in the same manner; it succeeded with the test data but failed on my file.
Also, the job (using my data) had a time allocation of 2 days but failed within a few hours. Any help would be much appreciated!
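For reference, a minimal sketch of the kind of SLURM script this describes. The paths are placeholders, and the blaze options (--expect-cells, --threads, --output-prefix) are assumptions based on typical BLAZE usage rather than the command actually run here; check blaze --help for the exact flags:

#!/bin/bash
#SBATCH --job-name=blaze
#SBATCH --time=2-00:00:00             # 2-day allocation, as described above
#SBATCH --cpus-per-task=12
#SBATCH --mem=64G

# Activate the virtualenv that has BLAZE installed (hypothetical path)
source /path/to/venv/bin/activate

# Hypothetical invocation; --expect-cells/--threads/--output-prefix are assumed flags
blaze --expect-cells 1000 --threads 12 --output-prefix blazeOut /path/to/reads.fastq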