Closed adriangeerre closed 1 year ago
On our HPC cluster, it works:
$ singularity --version
singularity version 3.2.0-1
$ singularity run scoary-2_latest.sif scoary2 --help
INFO: Showing help with the command 'scoary2 -- --help'.
NAME
scoary2 - Scoary2: Associate genes with traits!
(...)
and
$ singularity --version
apptainer version 1.1.8-1.el7
$ singularity run scoary-2_latest.sif scoary2 --help
INFO: Showing help with the command 'scoary2 -- --help'.
NAME
scoary2 - Scoary2: Associate genes with traits!
(...)
I'm not sure what went wrong... What version of Singularity are you using?
Hi again,
Thanks for the quick reply. Here is my version:
$ singularity --version
singularity version 3.8.5-2.el7
and here is the image size (I downloaded it again, same issue):
-rwxrwxr-x 1 adrian adrian 318742528 May 22 13:15 scoary-2_latest.sif
So it's unlikely to be a Singularity problem...
I don't understand why that's happening. The container contains the gcc compiler; it's included in build-essential, see the Dockerfile.
I wonder if your CPU is not compatible with mine and it needs to recompile something. What do you get from python -c "import numpy; numpy.show_config()"?
Maybe openblas is missing... Can you test if the image resulting from this Dockerfile works?
FROM troder/scoary-2
RUN apt-get update && \
    apt-get install -y libopenblas*
I have accessed the image and obtained the output you asked for:
$ singularity run scoary-2_latest.sif
Python 3.10.7 (main, Sep 13 2022, 01:42:44) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy
>>> numpy.show_config()
openblas64__info:
    libraries = ['openblas64_', 'openblas64_']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None), ('BLAS_SYMBOL_SUFFIX', '64_'), ('HAVE_BLAS_ILP64', None)]
    runtime_library_dirs = ['/usr/local/lib']
blas_ilp64_opt_info:
    libraries = ['openblas64_', 'openblas64_']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None), ('BLAS_SYMBOL_SUFFIX', '64_'), ('HAVE_BLAS_ILP64', None)]
    runtime_library_dirs = ['/usr/local/lib']
openblas64__lapack_info:
    libraries = ['openblas64_', 'openblas64_']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None), ('BLAS_SYMBOL_SUFFIX', '64_'), ('HAVE_BLAS_ILP64', None), ('HAVE_LAPACKE', None)]
    runtime_library_dirs = ['/usr/local/lib']
lapack_ilp64_opt_info:
    libraries = ['openblas64_', 'openblas64_']
    library_dirs = ['/usr/local/lib']
    language = c
    define_macros = [('HAVE_CBLAS', None), ('BLAS_SYMBOL_SUFFIX', '64_'), ('HAVE_BLAS_ILP64', None), ('HAVE_LAPACKE', None)]
    runtime_library_dirs = ['/usr/local/lib']
Supported SIMD extensions in this NumPy install:
    baseline = SSE,SSE2,SSE3
    found = SSSE3,SSE41,POPCNT,SSE42,AVX,F16C,FMA3,AVX2,AVX512F,AVX512CD,AVX512_SKX
    not found = AVX512_KNL,AVX512_KNM,AVX512_CLX,AVX512_CNL,AVX512_ICL
Do you get the same error with the new image?
I just found another possible missing library: llvm.
Try the image from this Dockerfile:
FROM troder/scoary-2
RUN apt-get update && \
    apt-get install -y libopenblas*
RUN apt-get install -y llvm-13 llvm-13-dev
ENV LLVM_CONFIG=/usr/bin/llvm-config-13
I'm not sure this works. I don't know what else to try; it sucks that I can't reproduce the error.
I was having a look at it; I am new to creating containers/images. I tried creating the image, but I do not have root access and --remote/--fakeroot did not work for me.
$ vim scoary-2_latest_testing.def
$ singularity build scoary-2_latest_testing.sif scoary-2_latest_testing.def
FATAL: You must be the root user, however you can use --remote or --fakeroot to build from a Singularity recipe file
$ singularity build --remote scoary-2_latest_testing.sif scoary-2_latest_testing.def
FATAL: Unable to submit build job: no authentication token, log in with `singularity remote login`
$ singularity build --fakeroot scoary-2_latest_testing.sif scoary-2_latest_testing.def
FATAL: could not use fakeroot: no mapping entry found in /etc/subuid for user
Do you know another way of creating or modifying an image? Thank you for your patience.
No worries!
I also cannot make Singularity images on the HPC because I don't have root access. I normally use a computer with both docker and singularity installed:
1) Create the new Docker image based on the Dockerfile: docker build . --tag scoary2_testing
2) Convert to a singularity image: singularity build scoary2_testing.sif docker-daemon://scoary2_testing:latest
3) Rsync the image to the HPC.
Hi again,
I built the last image you asked for and obtained a tar file from it. Then I uploaded the tar file to the HPC environment and built the sif image for singularity. However, I am still facing the same issue.
$ vim Dockerfile # Here I added the definition provided by you
$ docker build . --tag scoary2_testing
[+] Building 27.9s (3/6)
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 209B 0.0s
=> [internal] load metadata for docker.io/troder/scoary-2:latest 2.0s
=> [1/3] FROM docker.io/troder/scoary-2@sha256:691f961ec1f03307a4ba0366f5a692d0ab834d63f27f7991e1f3de4c9132c4f6 25.9s
=> => resolve docker.io/troder/scoary-2@sha256:691f961ec1f03307a4ba0366f5a692d0ab834d63f27f7991e1f3de4c9132c4f6 0.0s
=> => sha256:691f961ec1f03307a4ba0366f5a692d0ab834d63f27f7991e1f3de4c9132c4f6 1.73kB / 1.73kB 0.0s
=> => sha256:fccd2c046d99e8f0af41d0d01f789b580936dd078f63841769650bae87b08507 8.09kB / 8.09kB 0.0s
=> => sha256:31b3f1ad4ce1f369084d0f959813c51df0ca17d9877d5ee88c2db6ff88341430 31.40MB / 31.40MB 5.4s
=> => sha256:f335cc1597f2f2d13ceea1c9b386aa1ac28efc46906a0a9cf1b4e368ec33a62a 1.08MB / 1.08MB 1.6s
=> => sha256:501b4d0d8bea6f36e0132899acecf241b4fc7f91117be870fee91069d93f2384 12.10MB / 12.10MB 1.8s
=> => sha256:abd735557fdf93d385293000b9bb82151a0de2c674ff621a56484ea49bc7de4c 232B / 232B 1.9s
=> => sha256:9358bdbbffdc0f5b9e4cdb442d5676d421068183f5558979a2450ef59af16e4f 3.34MB / 3.34MB 3.5s
=> => sha256:27f95d985cf0387a3ca0f2c1006b67e291b0571351037fd3425ddd87a7bfc666 112.68MB / 112.68MB 19.3s
=> => sha256:00dba61931de4b5cafacdc9af33c876aad3903a4e321bdb4f73cebe36f32f5a8 103.69kB / 103.69kB 4.1s
=> => sha256:6f6ab1d53479161afda4ed0f31fee4ae752b23c861d57b1b83b31a17de33ea19 176.65MB / 176.65MB 25.2s
=> => extracting sha256:31b3f1ad4ce1f369084d0f959813c51df0ca17d9877d5ee88c2db6ff88341430 1.4s
=> => extracting sha256:f335cc1597f2f2d13ceea1c9b386aa1ac28efc46906a0a9cf1b4e368ec33a62a 0.1s
=> => extracting sha256:501b4d0d8bea6f36e0132899acecf241b4fc7f91117be870fee91069d93f2384 0.4s
=> => extracting sha256:abd735557fdf93d385293000b9bb82151a0de2c674ff621a56484ea49bc7de4c 0.0s
=> => extracting sha256:9358bdbbffdc0f5b9e4cdb442d5676d421068183f5558979a2450ef59af16e4f 0.3s
=> => extracting sha256:27f95d985cf0387a3ca0f2c1006b67e291b0571351037fd3425ddd87a7bfc666 3.1s
=> => extracting sha256:00dba61931de4b5cafacdc9af33c876aad3903a4e321bdb4f73cebe36f32f5a8 0.0s
=> => extracting sha256:6f6ab1d53479161afda4ed0f31fee4ae752b23c861d57b1b83b31a17de33ea19 8.5s
[+] Building 92.1s (7/7) FINISHED
=> [2/3] RUN apt-get update && apt-get install -y libopenblas* 21.2s
=> [3/3] RUN apt-get install -y llvm-13 llvm-13-dev 29.8s
=> exporting to image 4.8s
=> => exporting layers 4.8s
=> => writing image sha256:f61b254de4dcd859f99406bb1b89aa39e7ef35960afc978ab908d8d7dd5ea0c6 0.0s
=> => naming to docker.io/library/scoary2_testing
$ docker save f61b254de4dc -o scoary2_testing.tar
# Upload to the HPC and run:
$ singularity build scoary2_testing.sif docker-archive://scoary2_testing.tar # The command you provided failed for me
$ singularity run scoary2_testing.sif scoary2 --help
Traceback (most recent call last):
File "/usr/local/bin/scoary2", line 5, in <module>
from scoary.scoary import main
File "/usr/local/lib/python3.10/site-packages/scoary/__init__.py", line 1, in <module>
from .scoary import scoary
File "/usr/local/lib/python3.10/site-packages/scoary/scoary.py", line 7, in <module>
from .analyze_trait import analyze_trait, worker
File "/usr/local/lib/python3.10/site-packages/scoary/analyze_trait.py", line 7, in <module>
from fast_fisher.fast_fisher_numba import odds_ratio, test1t as fisher_exact_two_tailed
File "/usr/local/lib/python3.10/site-packages/fast_fisher/fast_fisher_numba.py", line 5, in <module>
cc = CC('fast_fisher_compiled')
File "/usr/local/lib/python3.10/site-packages/numba/pycc/cc.py", line 65, in __init__
self._toolchain = Toolchain()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 78, in __init__
self._raise_external_compiler_error()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. If using conda try:
#> conda install gcc_linux-64 gxx_linux-64
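For context, the import that crashes here provides Fisher's exact test. The following pure-Python sketch of the two-tailed test (a hypothetical stand-in, not fast_fisher's actual implementation) shows what the AOT-compiled module would compute:

```python
# Hypothetical pure-Python Fisher's exact test (two-tailed) for a 2x2 table
# [[a, b], [c, d]]. Illustrative only; fast_fisher compiles an optimized version.
from math import comb

def fisher_exact_two_tailed(a, b, c, d):
    n = a + b + c + d
    row1, col1 = a + b, a + c
    total = comb(n, col1)
    def p_table(x):
        # hypergeometric probability of x in the top-left cell, margins fixed
        return comb(row1, x) * comb(n - row1, col1 - x) / total
    p_obs = p_table(a)
    lo, hi = max(0, col1 - (n - row1)), min(row1, col1)
    # sum the probabilities of all tables at least as extreme as the observed one
    return sum(p for p in (p_table(x) for x in range(lo, hi + 1))
               if p <= p_obs * (1 + 1e-9))

print(fisher_exact_two_tailed(8, 2, 1, 5))  # ~0.035
```

This is exact for small tables; the compiled module exists because Scoary2 runs this test for every gene/trait pair.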
I didn't know one could use tarballs like this! Good to know!
I created a StackOverflow post.
Hi again,
I have asked a co-worker for a conda environment that he used a few months ago in the same HPC environment. I installed the environment using mamba, but I face the same issue, so it is not Singularity. It might be the environment, but gcc is present, and I have done as the error suggests and installed gcc with mamba. Tomorrow I will ask my co-worker if he can try to execute Scoary2, because I am puzzled.
Here is the environment: Scoary2_env.zip
$ which gcc
/usr/bin/gcc
$ gcc --version
gcc (GCC) 4.8.5 20150623 (Red Hat 4.8.5-44)
Copyright (C) 2015 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
$ mamba install gcc_linux-64 gxx_linux-64
$ conda list | grep gcc
_libgcc_mutex 0.1 conda_forge conda-forge
gcc_impl_linux-64 12.2.0 hcc96c02_19 conda-forge
gcc_linux-64 12.2.0 h4798a0e_13 conda-forge
libgcc-devel_linux-64 12.2.0 h3b97bd3_19 conda-forge
libgcc-ng 12.2.0 h65d4601_19 conda-forge
Thanks for the help!
Can you try executing regular numba code, for example, by following this tutorial?
Hi,
I have run the numba tutorial inside the Singularity image and it worked.
$ singularity run scoary2_testing.sif
Python 3.10.7 (main, Sep 13 2022, 01:42:44) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from numba import jit
>>> import numpy as np
>>>
>>> x = np.arange(100).reshape(10, 10)
>>> @jit(nopython=True) # Set "nopython" mode for best performance, equivalent to @njit
... def go_fast(a): # Function is compiled to machine code when called the first time
... trace = 0.0
... for i in range(a.shape[0]): # Numba likes loops
... trace += np.tanh(a[i, i]) # Numba likes NumPy functions
... return a + trace # Numba likes NumPy broadcasting
...
>>> print(go_fast(x))
[[ 9. 10. 11. 12. 13. 14. 15. 16. 17. 18.]
[ 19. 20. 21. 22. 23. 24. 25. 26. 27. 28.]
[ 29. 30. 31. 32. 33. 34. 35. 36. 37. 38.]
[ 39. 40. 41. 42. 43. 44. 45. 46. 47. 48.]
[ 49. 50. 51. 52. 53. 54. 55. 56. 57. 58.]
[ 59. 60. 61. 62. 63. 64. 65. 66. 67. 68.]
[ 69. 70. 71. 72. 73. 74. 75. 76. 77. 78.]
[ 79. 80. 81. 82. 83. 84. 85. 86. 87. 88.]
[ 89. 90. 91. 92. 93. 94. 95. 96. 97. 98.]
[ 99. 100. 101. 102. 103. 104. 105. 106. 107. 108.]]
Right after, I tried running the help, but it did not work.
$ singularity run scoary2_testing.sif scoary2 --help
Traceback (most recent call last):
File "/usr/local/bin/scoary2", line 5, in <module>
from scoary.scoary import main
File "/usr/local/lib/python3.10/site-packages/scoary/__init__.py", line 1, in <module>
from .scoary import scoary
File "/usr/local/lib/python3.10/site-packages/scoary/scoary.py", line 7, in <module>
from .analyze_trait import analyze_trait, worker
File "/usr/local/lib/python3.10/site-packages/scoary/analyze_trait.py", line 7, in <module>
from fast_fisher.fast_fisher_numba import odds_ratio, test1t as fisher_exact_two_tailed
File "/usr/local/lib/python3.10/site-packages/fast_fisher/fast_fisher_numba.py", line 5, in <module>
cc = CC('fast_fisher_compiled')
File "/usr/local/lib/python3.10/site-packages/numba/pycc/cc.py", line 65, in __init__
self._toolchain = Toolchain()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 78, in __init__
self._raise_external_compiler_error()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. If using conda try:
#> conda install gcc_linux-64 gxx_linux-64
This also works when using the image provided by docker (scoary-2_latest.sif).
Thanks!
Maybe this helps: https://github.com/numba/numba/issues/7218
I found the error to happen in:
>>> cc = CC('fast_fisher_compiled')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.10/site-packages/numba/pycc/cc.py", line 65, in __init__
self._toolchain = Toolchain()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 78, in __init__
self._raise_external_compiler_error()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. If using conda try:
#> conda install gcc_linux-64 gxx_linux-64
Scoary uses Ahead-of-Time compilation, so can you try this tutorial, too?
Hi again,
Unfortunately, it did not work for any image. I also tried using "fast_fisher_compiled" and it did not work. Here is an example of the error.
$ singularity run scoary-2_latest.sif
Python 3.10.7 (main, Sep 13 2022, 01:42:44) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from numba.pycc import CC
>>> cc = CC('my_module')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.10/site-packages/numba/pycc/cc.py", line 65, in __init__
self._toolchain = Toolchain()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 78, in __init__
self._raise_external_compiler_error()
File "/usr/local/lib/python3.10/site-packages/numba/pycc/platform.py", line 121, in _raise_external_compiler_error
raise RuntimeError(msg)
RuntimeError: Attempted to compile AOT function without the compiler used by `numpy.distutils` present. If using conda try:
#> conda install gcc_linux-64 gxx_linux-64
Moreover, I cannot import my_module (I am not sure I understand this part).
>>> import my_module
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'my_module'
I managed to reproduce the issue and found the problem!
On Fedora, I had to install python-devel and gcc-c++! (I don't know what the equivalent of this is on Debian-based systems.)
I tried installing the following in my environment:
mamba install -c conda-forge python-devtools
mamba install -c conda-forge gcc
Unfortunately, it did not work. Also, I cannot install directly on the server, so I cannot test your solution :( Have you updated the software?
Thanks for the effort :)
Are you sure that conda's gcc is being used? What is the output of which gcc? I think you have to do something like ln -s path/to/conda/x86_64-conda_cos7-linux-gnu-gcc ~/.local/bin/gcc to make your system actually use it.
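As a quick check, Python can report which gcc a process will resolve from PATH, which is roughly what the toolchain detection does (whether it prints /usr/bin/gcc or a conda path is the distinction that matters here):

```python
# Sketch: show which `gcc` a subprocess would pick up, mirroring `which gcc`.
import shutil

gcc = shutil.which("gcc")
print("gcc on PATH:", gcc)  # None means no gcc is visible on PATH at all
```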
Also, I've created a new Docker image:
# export SCOARY_VERSION=0.0.11
# podman build . --tag scoary2_test --build-arg SCOARY_VERSION=$SCOARY_VERSION
FROM continuumio/miniconda3
RUN conda install gcc_linux-64 gxx_linux-64 python=3.10
RUN ln -s /opt/conda/bin/x86_64-conda_cos7-linux-gnu-gcc /usr/bin/gcc
ARG SCOARY_VERSION
RUN pip install scoary-2==$SCOARY_VERSION && \
pip cache purge
# set these environment variables to directories where non-root is allowed to write
ENV NUMBA_CACHE_DIR=/tmp/NUMBA_CACHE_DIR
ENV CONFINT_DB=/tmp/CONFINT_DB
ENV MPLCONFIGDIR=/tmp/MPLCONFIGDIR
WORKDIR /data
You can download the Docker image like this: wget https://cloud.bioinformatics.unibe.ch/index.php/s/8NpdKCeHKy6Qycz/download/scoary2_test.tar.xz
md5sums:
.tar.xz: dbc6b739ab3a2c992cf528c29bc56387
.tar: ea68ff1654e366f64c739f0e1214db7a
Please test if this one works; it installs Scoary2 into the conda environment. I think singularity can load .tar images like this: singularity build scoary2_test.sif docker-archive://scoary2_test.tar
Hi again,
The result of which gcc is:
$ conda activate Scoary_2
$ which gcc
~/programas/minconda3.9/envs/Scoary_2/bin/gcc
$ gcc --version
gcc (conda-forge gcc 13.1.0-0) 13.1.0
Copyright (C) 2023 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
I tried ln -s ~/programas/minconda3.9/envs/Scoary_2/bin/gcc ~/.local/bin/gcc but scoary still shows the same error.
Regarding the new image, I have done the following but with no success:
$ wget https://cloud.bioinformatics.unibe.ch/index.php/s/8NpdKCeHKy6Qycz/download/scoary2_test.tar.xz
$ md5sum scoary2_test.tar.xz # Correct
$ tar -xf scoary2_test.tar.xz
$ md5sum scoary2_test.tar # Correct
$ singularity build scoary2_test.sif docker-archive://scoary2_test.tar
$ singularity exec scoary2_test.sif scoary2 # Same error
$ singularity exec scoary2_test.sif /bin/bash
Singularity> which gcc
/usr/bin/gcc
Singularity> gcc --version
gcc (Anaconda gcc) 11.2.0
Copyright (C) 2021 Free Software Foundation, Inc.
This is free software; see the source for copying conditions. There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
I would suggest closing the issue; it seems you have been able to reproduce and identify it. I cannot modify my system since I have no privileges, so I feel we are stuck. Nonetheless, I am still willing to help, and I appreciate your effort.
Yes, let's close it for now... Thanks for the issue and the debugging, I'm sorry it didn't work out.
I'll tell you if I have another idea. If you find a solution, please share it!
Hi,
I am trying to get started with Scoary2 using Singularity in an HPC environment. However, I am facing some issues:
Thank you for the help
Best regards, Adrián