ChelseaCHENX closed this issue 5 years ago.
Dear Chelsea,
Thank you for using XenofilteR. I will try to help as best I can, but I will need a little more information.
Problem 1 does indeed look like a memory problem. How many reads do you have in your samples? Samples with many sequence reads, and many mouse reads in particular, take a lot of memory. To test whether it really is a memory problem, you can select a random subset of reads from your BAM files to test with (for example: samtools view -b -s 0.10 in.bam > out.bam).
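To make sure the human and mouse subsets still contain the same read pairs, use the same seed and fraction for both files; samtools subsamples based on a hash of the read name, so identical settings keep identical read names. A rough sketch (file names are placeholders):

# The integer part of -s is the random seed, the fractional part the fraction kept (~10% here).
samtools view -b -s 42.10 sample.human.bam > sample.human.10pct.bam
samtools view -b -s 42.10 sample.mouse.bam > sample.mouse.10pct.bam

# The "[bam_header_read] EOF marker is absent" warning in your log also hints at a
# truncated BAM; quickcheck lists files whose EOF block is missing.
samtools quickcheck -v sample.human.bam sample.mouse.bam && echo "BAMs look intact"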
Problem 2 I have never encountered before: XenofilteR assigns all sequence reads to mouse and none to human. Even if the sample were pure mouse, a couple of sequence reads would still map better to the human reference genome, so clearly something else is happening. Did you check the mapping to human? Nothing strange in the BAM file or anywhere else? Would it be possible to share a file with me so I can run it and see where it goes wrong? A single BAM with only a subset of reads (~10,000) should do.
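A few quick checks on the human BAM usually tell a lot (assuming samtools is available; the file name below is taken from your log):

# Overall mapping statistics: total reads, mapped, properly paired, duplicates.
samtools flagstat NT006.human.bam

# Read counts per chromosome; all zeros, or reads only on unexpected contigs,
# would point to a reference/header mismatch.
# (Run samtools index NT006.human.bam first if no .bai exists.)
samtools idxstats NT006.human.bam

# The @SQ lines in the header show which reference genome the reads were mapped to.
samtools view -H NT006.human.bam | head -20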
Sorry I do not have a solution yet, but we should be able to figure this out together.
Kind regards, Oscar
Oscar Krijgsman, PhD
Postdoctoral research fellow
Division of Molecular Oncology & Immunology
The Netherlands Cancer Institute
Plesmanlaan 121, 1066 CX Amsterdam, the Netherlands
Phone: +31 20 512 2028
On 30 Jun 2019, at 18:51, Chelsea Chen (notifications@github.com) wrote:
Hello! I am having some problems running XenofilteR - I hope you can help!
Problem 1
When I run multiple sample pairs (human and mouse BAMs) via Slurm, the run always ends with this error on standard error:
Error: 'bplapply' receive data failed: error reading from connection
In addition: Warning message:
In doTryCatch(return(expr), name, parentenv, handler) :
  [bam_header_read] EOF marker is absent. The input is probably truncated.
Execution halted
The XenofilteR.log looks fine; it ends with the sample that was being processed, e.g.:
INFO [2019-06-30 10:34:57] NT017.human.bam - Filtered 4080529 read pairs out of 4080529 - 100 Percent
while the NT017.human_Filtered.bam file is empty.
I suspected a RAM problem, so I submitted the job with sbatch --mem=64g. That works well in most cases, but some jobs still fail as described above. I wonder how to solve this...
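For reference, my submission is roughly like this (simplified; the R script name is a placeholder):

#!/bin/bash
#SBATCH --mem=64g
#SBATCH --cpus-per-task=1

# The R script builds the sample list and calls XenofilteR on one human/mouse pair.
Rscript run_xenofilter.R

# After a job finishes (or dies) I check its actual peak memory with:
#   sacct -j <jobid> --format=JobID,MaxRSS,Elapsed,State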
Problem 2
For the files with a valid record in XenofilteR.log, I do get a filtered BAM and BAI, but all of them are very small, and the percentage is always 100%, e.g. from the log:
INFO [2019-06-30 10:19:10] NT006.human.bam - Filtered 424959 read pairs out of 424959 - 100 Percent
-rw-rw----+ 1 fangyuan alee 3.0K Jun 30 10:19 NT006.human_Filtered.bam
-rw-rw----+ 1 fangyuan alee 1.6K Jun 30 10:19 NT006.human_Filtered.bam.bai
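A direct read count should show whether anything survived the filter, e.g.:

# reads remaining in the filtered output vs. the original input
samtools view -c NT006.human_Filtered.bam
samtools view -c NT006.human.bam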
I wonder whether I have lost them all or none - in either case it is not quite what I expected. Could you help, or propose possible reasons for this outcome?
Appreciate your help and look forward to your reply : )
Chelsea
Hello - thanks for this program! I ran into the same problem as ChelseaCHENX, with 100% of human reads filtered out. Has this issue been resolved? If so, can you kindly point me to the solution, please? Thank you! Melody