icbi-lab / nextNEOpi

nextNEOpi: a comprehensive pipeline for computational neoantigen prediction

Error at install_IEDB process #4

Closed kevinpryan closed 2 years ago

kevinpryan commented 2 years ago

Hi there,

I’ve been attempting to run nextNEOpi on some test data for a while now but have been running into various issues. I am trying to run it on both an HPC and a cloud service, and at the moment I am having problems with the install_IEDB process. Installation of MHC II seems to be failing in both cases - I have attached the output from both scenarios.

nextneopi_cloud.txt nextneopi_hpc.txt

The errors are quite similar, but on the HPC, there also seems to be a permission error. I get a similar error when I try to run it with conda (instead of singularity) on the HPC.

Some things I have tried: installing tcsh and perl-ENV (as per https://downloads.iedb.org/tools/mhcii/3.1.6/README), using different versions of Python (3.6, 3.8), and installing an older version of mhcii (3.1.5). Any help would be appreciated, and let me know if you need more information.
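For reference, the dependency install was roughly the following (CentOS 7 package names, which may differ on other distributions):

$ sudo yum install -y tcsh perl-Env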

Operating system: CentOS 7

riederd commented 2 years ago

Hi,

can it be that on the HPC system you have an existing /opt/mhcflurry_data/2.0.0 or /opt/mhcflurry_data/ independently of nextNEOpi, and that /opt gets auto-bound on your system?

In the cloud environment it might be the same issue, namely that /opt from the host gets auto-bound into the Singularity container. This would hide the /opt directory inside the container, where the correct Python prerequisites and conda envs are installed.

Can you check the default bind paths and mounts on both systems? See also: https://sylabs.io/guides/3.1/user-guide/bind_paths_and_mounts.html
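A quick way to check, assuming singularity.conf is in its default location (/etc/singularity/singularity.conf) on both systems:

$ grep -i "^bind path" /etc/singularity/singularity.conf
$ echo "$SINGULARITY_BIND $SINGULARITY_BINDPATH"

The first command lists the bind paths configured system-wide, the second any extra bind paths set through the environment.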

HTH

kevinpryan commented 2 years ago

Hi, thanks for your response. Neither system has a directory called /opt/mhcflurry_data/2.0.0 or /opt/mhcflurry_data/. Looking at the # BIND PATH section of the singularity config file on both systems, /opt is not included in either file.

riederd commented 2 years ago

Can you run the following for me and post the output:

$ singularity run https://apps-01.i-med.ac.at/images/singularity/pVACtools_3.0.0_icbi_5dfca363.sif /bin/bash
Singularity> find  /opt -maxdepth 2 -type d

The output should look like the following:

/opt
/opt/conda
/opt/conda/.empty
/opt/conda/bin
/opt/conda/compiler_compat
/opt/conda/conda-meta
/opt/conda/condabin
/opt/conda/envs
/opt/conda/etc
/opt/conda/include
/opt/conda/info
/opt/conda/lib
/opt/conda/libexec
/opt/conda/man
/opt/conda/pkgs
/opt/conda/sbin
/opt/conda/share
/opt/conda/shell
/opt/conda/ssl
/opt/conda/x86_64-conda_cos6-linux-gnu
/opt/iedb
/opt/mhcflurry_data
/opt/tmp_src

And run this command in the nextNEOpi install dir:

$ grep -A 2 "withLabel:pVACtools" conf/process.config 

expected output:

    withLabel:pVACtools {
        container = 'https://apps-01.i-med.ac.at/images/singularity/pVACtools_3.0.0_icbi_5dfca363.sif'
    }

Thanks

kevinpryan commented 2 years ago

Output on cloud:

Test 1:

/opt/conda
/opt/conda/.empty
/opt/conda/bin
/opt/conda/compiler_compat
/opt/conda/conda-meta
/opt/conda/condabin
/opt/conda/envs
/opt/conda/etc
/opt/conda/include
/opt/conda/info
/opt/conda/lib
/opt/conda/libexec
/opt/conda/man
/opt/conda/pkgs
/opt/conda/sbin
/opt/conda/share
/opt/conda/shell
/opt/conda/ssl
/opt/conda/x86_64-conda_cos6-linux-gnu
/opt/iedb
/opt/mhcflurry_data
/opt/tmp_src

Test 2:

withLabel:pVACtools {
        container = 'https://apps-01.i-med.ac.at/images/singularity/pVACtools_3.0.0_icbi_5dfca363.sif'
    }

Output on HPC:

Test 1:

I'm getting very strange behaviour when I run this. I have contacted my sysadmin to see if he can figure out what is going on:

bash: module: command not found
bash: module: command not found
bash: module: command not found
bash: /home/mscstudent/anaconda3/bin/activate: No such file or directory

In addition, this essentially breaks my session and I have to log out and back in again, e.g.

module av
bash: module: command not found

Test 2 works fine:

withLabel:pVACtools {
        container = 'https://apps-01.i-med.ac.at/images/singularity/pVACtools_3.0.0_icbi_5dfca363.sif'
    }

riederd commented 2 years ago

I think this is because your $HOME is automounted into the container.

Can you try:


$ singularity run --no-home https://apps-01.i-med.ac.at/images/singularity/pVACtools_3.0.0_icbi_5dfca363.sif /bin/bash
Singularity> find  /opt -maxdepth 2 -type d
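If the home automount is indeed the problem, a possible workaround (just a sketch, not verified against nextNEOpi's own config, which may already set runOptions) would be to pass --no-home to Singularity through the Nextflow config, e.g.:

    singularity {
        runOptions = '--no-home'
    }
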
riederd commented 2 years ago

One more thing: can you also send me the files named .command.run and .command.sh from the work directories where the processes failed? (The location should be indicated in the error message you get on screen.)

kevinpryan commented 2 years ago

That worked, thanks. Here is the output for Test 1 on the HPC:

/opt
/opt/conda
/opt/conda/.empty
/opt/conda/bin
/opt/conda/compiler_compat
/opt/conda/conda-meta
/opt/conda/condabin
/opt/conda/envs
/opt/conda/etc
/opt/conda/include
/opt/conda/info
/opt/conda/lib
/opt/conda/libexec
/opt/conda/man
/opt/conda/pkgs
/opt/conda/sbin
/opt/conda/share
/opt/conda/shell
/opt/conda/ssl
/opt/conda/x86_64-conda_cos6-linux-gnu
/opt/iedb
/opt/mhcflurry_data
/opt/tmp_src

Here are the .command.run and .command.sh files for the install_IEDB process on the cloud system. I reran on the HPC and the install_IEDB process actually completed successfully - I have attached the relevant files from that run. cloud.command.run.txt cloud.command.sh.txt

hpc_successful.command.run.txt hpc_successful.command.sh.txt

However, I got another error in the run_hla_hd process; I have attached the output from that run and the relevant work files here: hpc_hla_hd_error.out.txt hpc_hlahd.command.out.txt hpc_hlahd.command.sh.txt

riederd commented 2 years ago

There should be a file named pm_extract in /data/kryan/sw/hlahd.1.4.0/bin/. Can you find it? Can you post the output of ls -la /data/kryan/sw/hlahd.1.4.0/bin/ and the content of /data/kryan/neoant/nextNEOpi/work/65/119a7312ab5b90b653dd08232f5198/.command.run?

Thanks

kevinpryan commented 2 years ago

Looks like pm_extract only exists as source (pm_extract.cpp) in /data/kryan/sw/hlahd.1.4.0/src:

ls -la /data/kryan/sw/hlahd.1.4.0/bin/
total 296
drwxr-xr-x 2 kryan kryan     78 May  9 19:48 .
drwxrwxrwx 8 kryan kryan   4096 May  9 18:36 ..
-rwxr-xr-x 1 kryan kryan 286716 Mar 30  2021 hla_estimation
-rwxr--r-- 1 kryan kryan    176 Mar 30  2021 ._hlahd.sh
-rwxr--r-- 1 kryan kryan   6973 Mar 30  2021 hlahd.sh

ls -la /data/kryan/sw/hlahd.1.4.0/src
total 344
drwxr-xr-x 2 kryan kryan  4096 May 10  2021 .
drwxrwxrwx 8 kryan kryan  4096 May  9 18:36 ..
-rw-r--r-- 1 kryan kryan   687 Mar 30  2021 CFASTA_define.h
-rw-r--r-- 1 kryan kryan   176 Mar 30  2021 ._CFASTA_Tools.h
-rw-r--r-- 1 kryan kryan  3496 Mar 30  2021 CFASTA_Tools.h
-rw-r--r-- 1 kryan kryan 30647 Mar 30  2021 Count.h
-rw-r--r-- 1 kryan kryan   176 Mar 30  2021 ._Create_fasta_from_dat.cpp
-rw-r--r-- 1 kryan kryan 27799 Mar 30  2021 Create_fasta_from_dat.cpp
-rw-r--r-- 1 kryan kryan  2188 Mar 31  2021 Define.h
-rw-r--r-- 1 kryan kryan   410 Mar 30  2021 Estimate.h
-rw-r--r-- 1 kryan kryan  2875 Mar 30  2021 get_diff_fasta.cpp
-rw-r--r-- 1 kryan kryan  9125 Mar 31  2021 hla_estimation.cpp
-rw-r--r-- 1 kryan kryan 11345 Mar 30  2021 Mcompare.h
-rw-r--r-- 1 kryan kryan  5655 Mar 30  2021 pick_up_allele.cpp
-rw-r--r-- 1 kryan kryan  3555 Mar 30  2021 Plot.h
-rw-r--r-- 1 kryan kryan 21022 Mar 30  2021 pm_extract.cpp
-rw-r--r-- 1 kryan kryan  8405 Mar 30  2021 Rank.h
-rw-r--r-- 1 kryan kryan 58436 Mar 31  2021 Read.h
-rw-r--r-- 1 kryan kryan  1206 Mar 30  2021 Reselect.h
-rw------- 1 kryan kryan   118 Mar 30  2021 .Rhistory
-rw-r--r-- 1 kryan kryan 10697 Mar 30  2021 sam_to_fastq_reverse.cpp
-rw-r--r-- 1 kryan kryan 29385 Mar 30  2021 Select.h
-rw-r--r-- 1 kryan kryan  6579 Mar 31  2021 Sort.h
-rw-r--r-- 1 kryan kryan 10570 Mar 30  2021 split_PM_reads.cpp
-rw-r--r-- 1 kryan kryan  1442 Mar 30  2021 split_shell.cpp
-rw-r--r-- 1 kryan kryan   241 Mar 30  2021 STFR_define.h
-rw-r--r-- 1 kryan kryan  1275 Mar 30  2021 STFR_Tools.h
-rw-r--r-- 1 kryan kryan   176 Mar 30  2021 ._Tools.h
-rw-r--r-- 1 kryan kryan  4047 Mar 30  2021 Tools.h
-rw-r--r-- 1 kryan kryan 20755 Mar 30  2021 Tree.h

It's possible that something went wrong during installation with install.sh, as the Readme.txt file is inaccurate about the location of files:

2. Installation and updating of dictionary

Move to hla_estimation directory (same directory of this Readme.txt) and type the following command.

> sh install.sh

For me there is no directory called hla_estimation (only one called /data/kryan/sw/hlahd.1.4.0/estimation), and the Readme.txt file is located under /data/kryan/sw/hlahd.1.4.0.

Content of /data/kryan/neoant/nextNEOpi/work/65/119a7312ab5b90b653dd08232f5198/.command.run: hpc_hlahd.command.run.txt

riederd commented 2 years ago

This looks strange. I just tried to install hlahd-1.4.0 myself and did not encounter any issues:

$ tar -xf hlahd.1.4.0.tar.gz
$ cd hlahd.1.4.0
$ sh install.sh
$ ls -la bin 
total 804
drwxr-xr-x 2  502 games    191 May 16 08:48 .
drwxr-xr-x 8  502 games    315 Mar 30  2021 ..
-rwxr--r-- 1  502 games    176 Mar 30  2021 ._hlahd.sh
-rwxr-xr-x 1 root root   24608 May 16 08:48 get_diff_fasta
-rwxr-xr-x 1 root root  221336 May 16 08:48 hla_est
-rwxr-xr-x 1  502 games 286716 Mar 30  2021 hla_estimation
-rwxr--r-- 1  502 games   6973 Mar 30  2021 hlahd.sh
-rwxr-xr-x 1 root root   41744 May 16 08:48 pick_up_allele
-rwxr-xr-x 1 root root   78912 May 16 08:48 pm_extract
-rwxr-xr-x 1 root root   63432 May 16 08:48 split_pm_read
-rwxr-xr-x 1 root root   29520 May 16 08:48 split_shell
-rwxr-xr-x 1 root root   42104 May 16 08:48 stfr

Could you try to reinstall it? And maybe also run the example given in the hlahd Readme.txt?

kevinpryan commented 2 years ago

Apologies for the delay in replying. I was trying to install hlahd-1.4.0 from a compute node, but it lacked the required dependencies, so the installation did not complete correctly. I installed it from the head node of the cluster instead and it worked fine. Thanks for all your help with this.
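In case it helps anyone else: install.sh appears to compile the C++ sources in src/ into bin/, so a quick sanity check before running it (assuming, as seems likely in my case, that the missing dependency was the compiler) is:

$ which g++ || echo "no C++ compiler on this node"
$ sh install.sh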

riederd commented 2 years ago

Good to know that you managed to get it to run. I'll close this issue now; feel free to reopen it or open a new one in case you hit something else. Thanks for your feedback and interest in nextNEOpi.