Closed — mawa86 closed this issue 2 years ago
Hi @mawa86 -- are you sure the files aren't already compressed? You can check that using the check_compression tool that comes as part of ont_fast5_api. MinKNOW uses vbz compression by default, so it's entirely possible your files are already as small as they're going to get.
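For anyone who wants to check by hand: fast5 files are plain HDF5 containers, so dataset compression can also be inspected directly with h5py. This is a minimal sketch on a synthetic file -- the "Raw/Signal" path is only illustrative (real single- and multi-read fast5 layouts differ), and note that vbz is a third-party HDF5 filter plugin, so h5py names built-in filters like gzip directly but reports vbz differently:

```python
import os
import tempfile

import h5py
import numpy as np

# Build a tiny synthetic HDF5 file with a gzip-compressed dataset.
# "Raw/Signal" is an illustrative path, not the exact fast5 layout.
path = os.path.join(tempfile.mkdtemp(), "demo.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("Raw/Signal",
                     data=np.arange(1000, dtype=np.int16),
                     compression="gzip")

# Reopen and inspect which compression filter the dataset uses.
with h5py.File(path, "r") as f:
    comp = f["Raw/Signal"].compression

# h5py reports built-in filters by name; a vbz-compressed dataset would
# not show up as 'gzip' here, since vbz is a plugin filter.
print(comp)  # → gzip
```

The check_compression tool does essentially this across whole directories of fast5 files, so it is the more convenient option for bulk checks.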
Thanks very much @fbrennen -- you are indeed right. I should have checked myself, but I couldn't find it in the official MinKNOW documentation; I've only just read about it. Thank you again!
/Martin
Hi
I have a folder with ca. 1600 Fast5 files of average size 80 MB, totalling circa 130 GB. I've installed ont_fast5_api (after some difficulties with NumPy's wheels...) on my machine running Ubuntu 18.04. The command I've used is:
compress_fast5 --input_path /path/to/my/fast5 --save_path /path/to/my/compressed_raw_reads --compression vbz --recursive --threads 24
The command seems to run fine:
| 1651 of 1651|##########################################################################|100% Time: 0:24:58
However, the output folder contains ca. 1600 Fast5 files of average size 80 MB, totalling circa 130 GB -- exactly the same amount of space. Am I doing something wrong? Should I first merge the many Fast5 files into one? Will that cause any problem later, should I need to re-basecall?
Thanks for any help
/Martin