Mikado is a lightweight Python 3 pipeline whose purpose is to facilitate the identification of expressed loci from RNA-Seq data and to select the best models in each locus.
I'm using mikado version 2.cr6. When I ran `mikado prepare` last week, it worked without any error. I don't remember whether I reinstalled mikado after that run. Today I ran the same command on a different file, but it fails with this error:
```
2020-02-05 13:51:08,019 - prepare - prepare.py:518 - ERROR - prepare - MainProcess - unable to open database file
Traceback (most recent call last):
File "/work/LAS/mash-lab/jing/bin/mikado/Mikado/preparation/prepare.py", line 488, in prepare
max_intron=args.json_conf["prepare"]["max_intron_length"],)
File "/work/LAS/mash-lab/jing/bin/mikado/Mikado/preparation/prepare.py", line 393, in load_exon_lines
_load_exon_lines_multi(args, shelve_names, logger, min_length, strip_cds, threads, max_intron)
File "/work/LAS/mash-lab/jing/bin/mikado/Mikado/preparation/prepare.py", line 348, in _load_exon_lines_multi
check_same_thread=False # Necessary for SQLite3 to function in multiprocessing
sqlite3.OperationalError: unable to open database file
2020-02-05 13:51:08,323 - prepare - prepare.py:520 - ERROR - prepare - MainProcess - Mikado has encountered an error, exiting
2020-02-05 13:51:08,422 - prepare - prepare.py:533 - INFO - prepare - MainProcess - Mikado prepare has finished correctly. The output mikado_out/mikado_prepared.fasta FASTA file can now be used for BLASTX and/or ORF calling before the next step in the pipeline, `mikado serialise`.
```
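For context, `sqlite3.OperationalError: unable to open database file` is the generic error SQLite raises when it cannot create or open the database file at the given path, commonly because the parent directory is missing or not writable. A minimal sketch reproducing the same error outside of Mikado (the file name `shelf.db` here is just an illustrative placeholder, not Mikado's actual database name):

```python
import os
import sqlite3
import tempfile

# Opening a database in an existing, writable directory succeeds,
# even with the check_same_thread=False flag seen in the traceback.
workdir = tempfile.mkdtemp()
conn = sqlite3.connect(os.path.join(workdir, "shelf.db"), check_same_thread=False)
conn.close()

# If the parent directory does not exist (or is unwritable), sqlite3
# raises the same OperationalError shown in the log above.
try:
    sqlite3.connect(os.path.join(workdir, "no_such_dir", "shelf.db"))
except sqlite3.OperationalError as exc:
    print(exc)  # unable to open database file
```

So one thing worth checking is whether the output directory exists and is writable on the node where the job runs.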
I have already reinstalled mikado, but I still get the same error.
Hi @lijing28101, thank you for reporting this. I am already looking into this issue and hope to have a fix soon.
Thank you for your patience.