humanlongevity / HLA

xHLA: Fast and accurate HLA typing from short read sequence data

Fail to run with test.bam file #32

Open · icymiktreize opened this issue 6 years ago

icymiktreize commented 6 years ago

(screenshot attached: xhla_errormessage)

I get an error when trying to run your test.bam (tests/test.bam) on Docker. It says:

[E::hts_open_format] fail to open file 'tests/test.bam'
samtools view: failed to open "tests/test.bam" for reading: No such file or directory

The script I ran is as below:

docker run -v `pwd`:`pwd` -w `pwd` humanlongevity/hla --sample_id test --input_bam_path tests/test.bam --output_path test

You can see the whole stack trace in the attached file. Can you help me run the script? I am not familiar with Docker. Thank you very much.

tanghaibao commented 6 years ago

@icymiktreize Do you see the file tests/test.bam in your current folder - which is /h according to the screenshot?
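One quick way to answer that question is to list the file from inside the container rather than the host (the paths here are taken from the report above; `--entrypoint` is a standard docker flag that replaces the image's default command). If the listing fails the same way, the bind mount, not xHLA, is the problem:

```shell
# Sanity check: run `ls` inside the container instead of the typer.
# If this prints "No such file or directory", the bind mount is not
# exposing the current folder to the container.
docker run -v `pwd`:`pwd` -w `pwd` --entrypoint ls humanlongevity/hla -l tests/test.bam
```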

yaruchen commented 6 years ago

@tanghaibao I get the same problem as icymiktreize, and I am sure that the BAM file I used is in my current folder. How can I fix the problem? Is there any other way?

yaruchen commented 6 years ago

I fixed my problem by running BWA-MEM against the hg38 reference instead of hg19, and then got the HLA type. Now I want to ask: does xHLA only support the hg38 reference, and not hg19?

tanghaibao commented 6 years ago

@yaruchen Currently yes - and ideally for consistency, the hg38 reference should not contain ALTs or any HLA contigs. Otherwise, the reads may need to be realigned. See discussion here.

https://github.com/humanlongevity/HLA/wiki/BAMs-compatible-with-xHLA
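For readers whose BAMs were aligned to hg19 or to an ALT-aware hg38 build, one possible realignment sketch is below (the file names are placeholders, not files shipped with xHLA, and the reference must be bwa-indexed first):

```shell
# Sketch: convert an existing BAM back to paired FASTQ, then realign
# against the GRCh38 no-ALT analysis set. All file names are placeholders.
samtools collate -u -O input.bam \
  | samtools fastq -1 r1.fq.gz -2 r2.fq.gz -0 /dev/null -s /dev/null -n -
bwa index GCA_000001405.15_GRCh38_no_alt_analysis_set.fna   # one-time step
bwa mem -t 8 GCA_000001405.15_GRCh38_no_alt_analysis_set.fna r1.fq.gz r2.fq.gz \
  | samtools sort -o realigned_hg38.bam -
samtools index realigned_hg38.bam
```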

rhdolin commented 3 years ago

I'm experiencing the same problem, I think. I'm using a CRAM file from Nebula. I converted the CRAM to BAM:

samtools view -b -T GCA_000001405.15_GRCh38_no_alt_analysis_set.fna.gz -o NB6TK328_chr6.bam NB6TK328.cram chr6

then indexed the BAM file:

samtools index NB6TK328_chr6.bam

Then, with all files in a single folder, tried this:

sudo docker run humanlongevity/hla --sample_id NB6TK328 --input_bam_path ~/Desktop/NB6TK328_chr6.bam --output_path temp

but got this error message:

[16/Nov/2020 18:31:02] INFO - Xie Chao's HLA typing algorithm
[16/Nov/2020 18:31:02] INFO - Sample_id: NB6TK328 Input file: /home/ubuntu/Desktop/NB6TK328_chr6.bam
typer.sh parameters: DELETE=false FULL=false
Extracting reads from S3
[E::hts_open_format] fail to open file '/home/ubuntu/Desktop/NB6TK328_chr6.bam'
samtools view: failed to open "/home/ubuntu/Desktop/NB6TK328_chr6.bam" for reading: No such file or directory
Traceback (most recent call last):
  File "/opt/bin/run.py", line 64, in check_call(bin_args)
  File "/usr/lib/python2.7/subprocess.py", line 540, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['/opt/bin/typer.sh', '/home/ubuntu/Desktop/NB6TK328_chr6.bam', 'NB6TK328']' returned non-zero exit status 1

I'd appreciate it if anyone can tell me what I'm doing wrong.

rhdolin commented 3 years ago

Quick follow-up to my question above: the problem was that the Docker container couldn't access files outside the container. I revised the syntax to this, which worked:

sudo docker run -it -v ~/Desktop:/home -w /home humanlongevity/hla --sample_id NB6TK328 --input_bam_path NB6TK328_chr6.bam --output_path temp

dcweeks commented 1 year ago

I was able to generate BAM/index files and process them following @rhdolin's information above, and I was also able to run Nebula CRAM files directly. The following format simplified things for me and seemed to work well:

sudo docker run -it -v `pwd`:`pwd` -w `pwd` humanlongevity/hla --sample_id NG1092V0RF --input_bam_path NG1092V0RF.cram --output_path temp

From Docker command line docs: "The -v flag mounts the current working directory into the container. The -w lets the command being executed inside the current working directory, by changing into the directory to the value returned by pwd. So this combination executes the command using the container, but inside the current working directory."
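To see that combination in isolation, here is a minimal demonstration using the small public alpine image as a stand-in (an assumption for brevity; the humanlongevity/hla image resolves mounted paths the same way):

```shell
# `pwd` expands on the host before docker runs, so -v mounts the host's
# current directory at the same absolute path inside the container, and
# -w makes that path the container's working directory.
docker run --rm -v `pwd`:`pwd` -w `pwd` alpine pwd
# prints the same absolute path as running `pwd` on the host
```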