Hi,
I'm using shogun filter to decontaminate reads that map to the human genome, but I run into an incompatible-DB error. The workaround I found is to remove reads longer than 254 bases; after that, shogun filter finishes correctly and removes some of the reads associated with the human genome. However, I lose quite a lot of reads when keeping only those shorter than 254 bases. Is there another way to avoid the DB incompatibility issue? The shogun filter step uses BURST and two databases, humanD252.acx and humanD252.edx.
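For reference, this is roughly how I apply the length-based workaround before running shogun filter (a minimal sketch assuming plain, uncompressed FASTQ; the 254-base cutoff is the one that made the incompatibility error go away for me):

```python
# Sketch of the workaround: drop reads longer than 254 bases before
# passing the FASTQ to shogun filter. Assumes plain 4-line FASTQ records.
MAX_LEN = 254  # reads longer than this trigger the DB incompatibility

def parse_fastq(lines):
    """Group a stream of FASTQ lines into (header, seq, plus, qual) tuples."""
    it = iter(lines)
    for header in it:
        yield (header.rstrip("\n"),
               next(it).rstrip("\n"),
               next(it).rstrip("\n"),
               next(it).rstrip("\n"))

def filter_by_length(records, max_len=MAX_LEN):
    """Yield only the records whose sequence is at most max_len bases."""
    for header, seq, plus, qual in records:
        if len(seq) <= max_len:
            yield header, seq, plus, qual

if __name__ == "__main__":
    import sys
    # Read FASTQ on stdin, write length-filtered FASTQ on stdout.
    for record in filter_by_length(parse_fastq(sys.stdin)):
        print("\n".join(record))
```

This keeps the pipeline running but, as I said, discards a lot of data, which is why I'm hoping there's a cleaner fix on the database side.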
Best,
Mariusz