Hello fastqwiper maintainers,

I'm encountering an issue while using fastqwiper to process a damaged fastq.gz file. Here are the details of the problem:

I am attempting to process a fastq.gz file that was damaged by bad sectors on the hard drive. Previous attempts to decompress the file directly, or to filter out bad reads with other tools, have failed. fastqwiper seemed promising until it hit the error below after processing approximately 534,500,000 reads.
Traceback (most recent call last):
  File "/home/godusevpn/miniconda310/envs/fastqwiper/bin/fastqwiper", line 11, in <module>
    sys.exit(main())
  File "/home/godusevpn/miniconda310/envs/fastqwiper/lib/python3.11/site-packages/fastq_wiper/wiper.py", line 317, in main
    wipe_fastq(args.fastq_in, args.fastq_out, args.log_out, args.log_frequency, args.alphabet)
  File "/home/godusevpn/miniconda310/envs/fastqwiper/lib/python3.11/site-packages/fastq_wiper/wiper.py", line 255, in wipe_fastq
    for line in fin:
  File "/home/godusevpn/miniconda310/envs/fastqwiper/lib/python3.11/gzip.py", line 314, in read1
    return self._buffer.read1(size)
  File "/home/godusevpn/miniconda310/envs/fastqwiper/lib/python3.11/_compression.py", line 68, in readinto
    data = self.read(len(byte_view))
  File "/home/godusevpn/miniconda310/envs/fastqwiper/lib/python3.11/gzip.py", line 507, in read
    uncompress = self._decompressor.decompress(buf, size)
zlib.error: Error -3 while decompressing data: invalid distance too far back
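For context, this error comes from Python's gzip/zlib layer rather than from fastqwiper itself: "invalid distance too far back" means a deflate back-reference points outside the 32 KiB window, which is typical of mid-stream corruption such as bad sectors. A small self-contained reproduction of the general failure mode (flipping one byte in an otherwise valid gzip stream; the exact exception varies with where the damage lands):

```python
import gzip
import zlib

# Build a small, valid FASTQ-like gzip payload.
data = b"@read1\nACGTACGT\n+\nFFFFFFFF\n" * 5000
blob = bytearray(gzip.compress(data))

# Simulate a bad sector: flip one byte in the middle of the compressed stream.
blob[len(blob) // 2] ^= 0xFF

try:
    gzip.decompress(bytes(blob))
except (zlib.error, gzip.BadGzipFile, EOFError) as exc:
    # Depending on where the damage lands, this is a zlib structure error
    # (e.g. "invalid distance too far back") or a failed CRC/length check.
    print("decompression failed:", exc)
```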
Could you please advise on how to handle this error? Is there a workaround or parameter adjustment that might help in processing such a damaged file with fastqwiper? Any guidance or insights would be greatly appreciated.
The command I ran:

fastqwiper -i TT_A12_TdLN6_S1_L001_R2_001.fastq.gz -o TT_A12_TdLN6_S1_L001_R2_001_fastqwiper.fastq.gz
Additional information:
Python version: 3.11.9
fastqwiper version: 2024.1.93
Best regards,
Xiangyang