**Open** · phlatphish opened this issue 2 years ago
I'm using KrakenUniq with the pre-built database downloaded from https://benlangmead.github.io/aws-indexes/k2 (the 384 GB one labelled EuPathDB48) to generate the report file. Subsequently, Bracken ran cleanly using that database on KrakenUniq reports.
Bracken was installed via its install shell script rather than via conda. When I ran the program, it reported:
```
bracken -d ~/krakenuniq -i new_report.tsv -o new_bracken -w new_bracken_report -r 50 -l S -t 0
```

```
Checking for Valid Options...
Running Bracken
python src/est_abundance.py -i new_report.tsv -o new_bracken -k /hdd1/home/f22_yfeng/krakenuniq/database50mers.kmer_distrib -l S -t 0
PROGRAM START TIME: 10-13-2022 11:57:37
Checking report file: new_report.tsv
Traceback (most recent call last):
  File "/hdd1/home/f22_yfeng/Bracken/src/est_abundance.py", line 554, in <module>
    main()
  File "/hdd1/home/f22_yfeng/Bracken/src/est_abundance.py", line 339, in main
    [mapped_taxid, mapped_taxid_dict] = process_kmer_distribution(line,lvl_taxids,map2lvl_taxids)
  File "/hdd1/home/f22_yfeng/Bracken/src/est_abundance.py", line 100, in process_kmer_distribution
    [g_taxid,mkmers,tkmers] = genome_str.split(':')
ValueError: not enough values to unpack (expected 3, got 1)
```
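The failure mode is easy to reproduce in isolation: the line that crashes splits each genome entry on `:` and unpacks exactly three fields, so an entry with no colons raises exactly this `ValueError`. A minimal sketch of that logic (`parse_genome_entry` is an illustrative stand-in, not Bracken's actual function):

```python
def parse_genome_entry(genome_str):
    # Bracken expects each genome entry as "taxid:mapped_kmers:total_kmers".
    parts = genome_str.split(':')
    if len(parts) != 3:
        # Mirrors the unpacking failure seen in est_abundance.py line 100.
        raise ValueError(
            "not enough values to unpack (expected 3, got %d): %r"
            % (len(parts), genome_str))
    g_taxid, mkmers, tkmers = parts
    return g_taxid, int(mkmers), int(tkmers)

# A well-formed entry parses cleanly...
print(parse_genome_entry("5855:120:3500"))  # ('5855', 120, 3500)

# ...while an entry with no colons triggers the same error class.
try:
    parse_genome_entry("5855")
except ValueError as e:
    print(e)
```

So the traceback indicates the `database50mers.kmer_distrib` file contains at least one genome entry that is not a colon-separated triple, i.e. the file was generated in a different format than `est_abundance.py` expects.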
I'm wondering how I can fix this? Many thanks.
@phlatphish Can you try running without the -x flag?
Otherwise, I did fix the script: I had accidentally left kraken2 as the default (ignoring the -y flag) when -x was specified. The newest version fixes this.
@fengyuchengdu can you open a new issue? I think your kmer_distribution file might be wrong. I need to see the kmer_distribution file you downloaded / are using.
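A quick way to sanity-check a `*.kmer_distrib` file before re-running Bracken is to scan it for genome entries that are not colon-separated triples. The layout assumed here (one header line, tab-separated columns, space-separated `taxid:mapped_kmers:total_kmers` entries in the second column) is inferred from the traceback above, so verify it against your actual file:

```python
def check_kmer_distrib(path, max_report=5):
    """Return (line_number, entry) pairs for genome entries that are not
    colon-separated triples, stopping after max_report problems."""
    bad = []
    with open(path) as fh:
        next(fh, None)  # skip the assumed header line
        for lineno, line in enumerate(fh, start=2):
            cols = line.rstrip('\n').split('\t')
            if len(cols) < 2:
                bad.append((lineno, line.rstrip()))
            else:
                for entry in cols[1].split(' '):
                    if entry and entry.count(':') != 2:
                        bad.append((lineno, entry))
            if len(bad) >= max_report:
                break
    return bad
```

An empty result suggests the file at least has the expected shape; any reported entry would trip the same unpacking error shown in the traceback.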
There is still something funny going on. I have the latest Bracken build in ~/build/Bracken, with the executables symlinked into ~/bin, which is on my PATH. For KrakenUniq I am using a conda package, version 0.6. When I had kraken and kraken2 installed with conda alongside krakenuniq, I got this error:
When I uninstalled kraken and kraken2, leaving only krakenuniq, bracken-build started working:
That's as far as I have gotten.
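For anyone hitting the same conflict: since the behavior above suggests bracken-build's result depends on which kraken flavors are installed, it can help to check what each binary name resolves to on PATH before running it. A generic stand-library sketch (the three binary names are the usual ones for these tools; adjust for your install):

```python
import shutil

# bracken-build invokes an external kraken binary; with kraken, kraken2,
# and krakenuniq all installed, it may not pick the one you intend.
for exe in ("kraken", "kraken2", "krakenuniq"):
    path = shutil.which(exe)
    print(f"{exe:12s} -> {path if path else 'not found'}")
```

If more than one resolves, uninstalling the unwanted ones (as above) or reordering PATH ensures the intended binary wins.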