soedinglab / hh-suite

Remote protein homology detection suite.
https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-019-3019-7
GNU General Public License v3.0

Errors from hhsuitedb.py #327

Open rcedgar opened 1 year ago

I am trying to build a database on a Ubuntu server. I am using this release: hhsuite-3.3.0-AVX2-Linux.tar.gz.

The documentation and examples for hhsuitedb.py are rather sparse, but as best I can tell I need to create a directory with a2m alignments and run hhsuitedb.py with this directory as the current directory. There appears to be a dependency on mpirun, but after installing it I still get errors as described below.

I installed mpirun as follows:

sudo apt install openmpi-bin

Running in a different directory from the a2m files:

hhsuitedb.py  "--ia3m=../a2m/*.a2m" --cpu 1 -o xxx

I get one warning for every a2m file:

Warning: could not find '../a2m/iter94.a2m'

Then no database is created. The warnings are puzzling because the files do exist at the relative path shown in the warning. If I use a glob with the full path name instead of a relative path, I get the same warnings and same end result (no database).
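For what it's worth, my guess is that because the glob is quoted, the shell passes it to the script verbatim, so the script has to expand it itself, and a relative pattern only matches from whatever working directory the expansion happens in. A minimal standalone sketch (hypothetical paths, not hh-suite code) showing how Python's glob.glob behaves:

```python
import glob
import os
import tempfile

# Hypothetical reproduction (not hh-suite code): a quoted glob like
# "../a2m/*.a2m" reaches the script unexpanded, and a relative pattern
# resolves against the current working directory when glob.glob() runs.
tmp = tempfile.mkdtemp()
os.makedirs(os.path.join(tmp, "a2m"))
open(os.path.join(tmp, "a2m", "iter94.a2m"), "w").close()
os.makedirs(os.path.join(tmp, "work"))

os.chdir(os.path.join(tmp, "work"))
print(glob.glob("../a2m/*.a2m"))  # ['../a2m/iter94.a2m'] -- matches from work/

os.chdir(tmp)
print(glob.glob("../a2m/*.a2m"))  # [] -- same pattern, different cwd, no match
```

If hhsuitedb.py changes directory (or expands the pattern from somewhere else) before checking the files, that could explain why the warnings name a path that does exist relative to where I launched the command.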

Running in the same directory as the a2m files, I get this error:

hhsuitedb.py  "--ihhm=*.a2m" --cpu 1 -o xxx

beignet-opencl-icd: no supported GPU found, this is probably the wrong opencl-icd package for this hardware
(If you have multiple ICDs installed and OpenCL works, you can ignore this message)
--------------------------------------------------------------------------
mpirun was unable to find the specified executable file, and therefore
did not launch the job.  This error was first reported for process
rank 0; it may have occurred for other processes as well.

NOTE: A common cause for this error is misspelling a mpirun command
      line parameter option (remember that mpirun interprets the first
      unrecognized command line token as the executable).

Node:       i9
Executable: ffindex_apply_mpi
--------------------------------------------------------------------------
Traceback (most recent call last):
  File "/mnt/d/sw/hhsuite/scripts/rce_hhsuitedb.py", line 483, in 
    main()
  File "/mnt/d/sw/hhsuite/scripts/rce_hhsuitedb.py", line 479, in main
    check_database(options.output_basename, options.nr_cores, options.force_mode)
  File "/mnt/d/sw/hhsuite/scripts/rce_hhsuitedb.py", line 376, in check_database
    calculate_hhm(threads, output_basename+"_a3m", output_basename+"_hhm")
  File "/mnt/d/sw/hhsuite/scripts/rce_hhsuitedb.py", line 106, in calculate_hhm
    check_call(" ".join(["mpirun", "-np", threads, "ffindex_apply_mpi", a3m_base_path+".ffdata", large_a3m_index, "-d", hhm_base_path+".ffdata", "-i", hhm_base_path+".ffindex", "--", "hhmake", "-v", str(0), "-i", "stdin", "-o" ,"stdout"]), shell=True)
  File "/home/bob/miniconda3/lib/python3.9/subprocess.py", line 373, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command 'mpirun -np 10 ffindex_apply_mpi xxx_a3m.ffdata /tmp/tmpkjzhrynd/large.ffindex -d xxx_hhm.ffdata -i xxx_hhm.ffindex -- hhmake -v 0 -i stdin -o stdout' returned non-zero exit status 134.
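Since mpirun reports that it could not find the executable, here is the quick sanity check I use (hypothetical helper, not part of hh-suite) to confirm the binaries hhsuitedb.py shells out to are actually on PATH:

```python
import shutil

# Hypothetical sanity check (not part of hh-suite): mpirun said it could not
# find ffindex_apply_mpi, so confirm each binary the script invokes resolves
# on PATH the same way subprocess/mpirun would see it.
for exe in ("mpirun", "ffindex_apply_mpi", "hhmake"):
    path = shutil.which(exe)
    print(f"{exe}: {path or 'NOT FOUND on PATH'}")
```

On my machine ffindex_apply_mpi lives in the hh-suite bin/ directory, so perhaps that directory needs to be added to PATH before running hhsuitedb.py.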