Closed boyanboyue closed 1 month ago
I do not believe we have a method for running kraken2 with limited RAM when the database is that much larger than your memory. I think you will have to use one of the provided capped-size (minikraken) databases: https://benlangmead.github.io/aws-indexes/k2
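For reference, a typical workflow with one of the capped-size databases looks roughly like this. The exact archive name and date in the URL are assumptions; check the index page above for the current links, and adjust paths and thread counts to your machine.

```shell
# Download and unpack a capped-size Kraken2 database (filename/date are
# placeholders -- take the real link from benlangmead.github.io/aws-indexes/k2).
mkdir -p k2_standard_16gb
curl -L -o k2_standard_16gb.tar.gz \
    https://genome-idx.s3.amazonaws.com/kraken/k2_standard_16gb_20240112.tar.gz
tar -xzf k2_standard_16gb.tar.gz -C k2_standard_16gb

# Classify reads; --memory-mapping avoids copying the whole index into RAM
# (it is read from disk instead, which is slower but fits small machines).
kraken2 --db k2_standard_16gb \
        --threads 8 \
        --memory-mapping \
        --report sample.k2report \
        --output sample.kraken2 \
        sample.fastq.gz
```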
You can run KrakenUniq instead with a database of any size by using its --preload-size option. We described this in a bioRxiv preprint and in an open-access CS journal: https://joss.theoj.org/papers/10.21105/joss.04908 Note, however, that KrakenUniq uses a different database format from Kraken2; compatible databases are available on the AWS site linked by Jen above. I've run it successfully with a 430GB database on my 16GB RAM laptop.
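A minimal sketch of such a KrakenUniq run, assuming the tool is installed and a KrakenUniq-format database has been downloaded; the database path, 12G preload size, and file names are placeholders to adapt to your setup:

```shell
# Run KrakenUniq on a low-RAM machine. --preload-size loads the database
# in chunks of at most the given size, so the full 704GB index never needs
# to be resident in memory at once.
krakenuniq --db /path/to/krakenuniq_nt_db \
           --preload-size 12G \
           --threads 8 \
           --report-file sample.report \
           --output sample.krakenuniq \
           sample.fastq.gz
```

Expect the run to be slower than a fully preloaded database, since chunks are swapped in and out from disk.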
OK, I will give it a try. Thanks a lot for all your replies.
Hi, I want to run an analysis with kraken2 using prebuilt nt database index files. The hash file is about 704 GB, but I can't run it on my machine with 140 GB of RAM; I know that's not enough. I understand that the
--memory-mapping
option might help, but my /dev/shm directory is also too small. Can anyone tell me how to run the analysis with the least RAM possible, and roughly how much hardware it would need? Please give me some suggestions for using kraken2 with the nt database.