polio-nanopore / piranha


ERROR: Failed building wheel for medaka #223

Closed: a-qvecell closed this issue 3 months ago

a-qvecell commented 4 months ago

Hello :)

I would like to try running piranha and followed the very nice installation guide. It went smoothly until the step "pip install ." where I got the error: ERROR: Failed building wheel for medaka (full output below). It might be something related to our server, but maybe you have an idea of what causes this?

Thank you in advance, best regards, Amanda :)

pip install .

Processing piranha
  Preparing metadata (setup.py) ... done
Collecting mako==1.2 (from piranha==1.2.2)
  Using cached Mako-1.2.0-py3-none-any.whl.metadata (2.9 kB)
Collecting pandas~=1.5 (from piranha==1.2.2)
  Using cached pandas-1.5.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (11 kB)
Collecting snipit>=1.2 (from piranha==1.2.2)
  Using cached snipit-1.2-py3-none-any.whl.metadata (281 bytes)
Requirement already satisfied: biopython in amqv/.local/lib/python3.10/site-packages (from piranha==1.2.2) (1.81)
Collecting medaka>=1.7.1 (from piranha==1.2.2)
  Using cached medaka-1.11.3.tar.gz (16.8 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting numpy<=1.23.5 (from piranha==1.2.2)
  Using cached numpy-1.23.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.3 kB)
Requirement already satisfied: scipy~=1.11 in amqv/.local/lib/python3.10/site-packages (from piranha==1.2.2) (1.11.4)
Requirement already satisfied: MarkupSafe>=0.9.2 in piranha/lib/python3.10/site-packages (from mako==1.2->piranha==1.2.2) (2.1.5)
Collecting cffi==1.15.0 (from medaka>=1.7.1->piranha==1.2.2)
  Using cached cffi-1.15.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl.metadata (1.2 kB)
Collecting edlib (from medaka>=1.7.1->piranha==1.2.2)
  Using cached edlib-1.3.9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (9.0 kB)
Collecting grpcio (from medaka>=1.7.1->piranha==1.2.2)
  Using cached grpcio-1.62.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (4.0 kB)
Collecting h5py (from medaka>=1.7.1->piranha==1.2.2)
  Using cached h5py-3.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.5 kB)
Collecting intervaltree (from medaka>=1.7.1->piranha==1.2.2)
  Using cached intervaltree-3.1.0-py2.py3-none-any.whl
Collecting tensorflow~=2.10.0 (from medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorflow-2.10.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.1 kB)
Collecting mappy (from medaka>=1.7.1->piranha==1.2.2)
  Using cached mappy-2.26-cp310-cp310-linux_x86_64.whl
Collecting ont-fast5-api (from medaka>=1.7.1->piranha==1.2.2)
  Using cached ont_fast5_api-4.1.3-py3-none-any.whl.metadata (13 kB)
Collecting parasail (from medaka>=1.7.1->piranha==1.2.2)
  Using cached parasail-1.3.4-py2.py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (21 kB)
Requirement already satisfied: pysam>=0.16.0.1 in amqv/.local/lib/python3.10/site-packages (from medaka>=1.7.1->piranha==1.2.2) (0.22.0)
Collecting pyspoa>=0.2.1 (from medaka>=1.7.1->piranha==1.2.2)
  Using cached pyspoa-0.2.1.tar.gz (52 kB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: requests in piranha/lib/python3.10/site-packages (from medaka>=1.7.1->piranha==1.2.2) (2.31.0)
Collecting wurlitzer (from medaka>=1.7.1->piranha==1.2.2)
  Using cached wurlitzer-3.0.3-py3-none-any.whl.metadata (1.9 kB)
Collecting pycparser (from cffi==1.15.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached pycparser-2.21-py2.py3-none-any.whl.metadata (1.1 kB)
Collecting python-dateutil>=2.8.1 (from pandas~=1.5->piranha==1.2.2)
  Using cached python_dateutil-2.8.2-py2.py3-none-any.whl.metadata (8.2 kB)
Collecting pytz>=2020.1 (from pandas~=1.5->piranha==1.2.2)
  Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
Requirement already satisfied: matplotlib>=3.2.1 in amqv/.local/lib/python3.10/site-packages (from snipit>=1.2->piranha==1.2.2) (3.8.2)
Requirement already satisfied: setuptools in piranha/lib/python3.10/site-packages (from snipit>=1.2->piranha==1.2.2) (69.1.1)
Requirement already satisfied: contourpy>=1.0.1 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (1.2.0)
Requirement already satisfied: cycler>=0.10 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (0.12.1)
Requirement already satisfied: fonttools>=4.22.0 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (4.46.0)
Requirement already satisfied: kiwisolver>=1.3.1 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (1.4.5)
Requirement already satisfied: packaging>=20.0 in piranha/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (23.2)
Requirement already satisfied: pillow>=8 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (10.1.0)
Requirement already satisfied: pyparsing>=2.3.1 in amqv/.local/lib/python3.10/site-packages (from matplotlib>=3.2.1->snipit>=1.2->piranha==1.2.2) (3.1.1)
Collecting six>=1.5 (from python-dateutil>=2.8.1->pandas~=1.5->piranha==1.2.2)
  Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Collecting absl-py>=1.0.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached absl_py-2.1.0-py3-none-any.whl.metadata (2.3 kB)
Collecting astunparse>=1.6.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached astunparse-1.6.3-py2.py3-none-any.whl.metadata (4.4 kB)
Collecting flatbuffers>=2.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached flatbuffers-23.5.26-py2.py3-none-any.whl.metadata (850 bytes)
Collecting gast<=0.4.0,>=0.2.1 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached gast-0.4.0-py3-none-any.whl.metadata (1.1 kB)
Collecting google-pasta>=0.1.1 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached google_pasta-0.2.0-py3-none-any.whl.metadata (814 bytes)
Collecting keras<2.11,>=2.10.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached keras-2.10.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting keras-preprocessing>=1.1.1 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached Keras_Preprocessing-1.1.2-py2.py3-none-any.whl.metadata (1.9 kB)
Collecting libclang>=13.0.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached libclang-16.0.6-py2.py3-none-manylinux2010_x86_64.whl.metadata (5.2 kB)
Collecting opt-einsum>=2.3.2 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached opt_einsum-3.3.0-py3-none-any.whl.metadata (6.5 kB)
Collecting protobuf<3.20,>=3.9.2 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached protobuf-3.19.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (787 bytes)
Collecting tensorboard<2.11,>=2.10 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorboard-2.10.1-py3-none-any.whl.metadata (1.9 kB)
Collecting tensorflow-io-gcs-filesystem>=0.23.1 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorflow_io_gcs_filesystem-0.36.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (14 kB)
Collecting tensorflow-estimator<2.11,>=2.10.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorflow_estimator-2.10.0-py2.py3-none-any.whl.metadata (1.3 kB)
Collecting termcolor>=1.1.0 (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached termcolor-2.4.0-py3-none-any.whl.metadata (6.1 kB)
Requirement already satisfied: typing-extensions>=3.6.6 in piranha/lib/python3.10/site-packages (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2) (4.10.0)
Requirement already satisfied: wrapt>=1.11.0 in piranha/lib/python3.10/site-packages (from tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2) (1.16.0)
Collecting sortedcontainers<3.0,>=2.0 (from intervaltree->medaka>=1.7.1->piranha==1.2.2)
  Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl.metadata (10 kB)
Collecting progressbar33>=2.3.1 (from ont-fast5-api->medaka>=1.7.1->piranha==1.2.2)
  Using cached progressbar33-2.4-py3-none-any.whl
Requirement already satisfied: charset-normalizer<4,>=2 in piranha/lib/python3.10/site-packages (from requests->medaka>=1.7.1->piranha==1.2.2) (3.3.2)
Requirement already satisfied: idna<4,>=2.5 in piranha/lib/python3.10/site-packages (from requests->medaka>=1.7.1->piranha==1.2.2) (3.6)
Requirement already satisfied: urllib3<3,>=1.21.1 in piranha/lib/python3.10/site-packages (from requests->medaka>=1.7.1->piranha==1.2.2) (2.2.1)
Requirement already satisfied: certifi>=2017.4.17 in piranha/lib/python3.10/site-packages (from requests->medaka>=1.7.1->piranha==1.2.2) (2024.2.2)
Requirement already satisfied: wheel<1.0,>=0.23.0 in piranha/lib/python3.10/site-packages (from astunparse>=1.6.0->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2) (0.42.0)
Collecting google-auth<3,>=1.6.3 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached google_auth-2.28.1-py2.py3-none-any.whl.metadata (4.7 kB)
Collecting google-auth-oauthlib<0.5,>=0.4.1 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached google_auth_oauthlib-0.4.6-py2.py3-none-any.whl.metadata (2.7 kB)
Collecting markdown>=2.6.8 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached Markdown-3.5.2-py3-none-any.whl.metadata (7.0 kB)
Collecting tensorboard-data-server<0.7.0,>=0.6.0 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorboard_data_server-0.6.1-py3-none-manylinux2010_x86_64.whl.metadata (1.1 kB)
Collecting tensorboard-plugin-wit>=1.6.0 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached tensorboard_plugin_wit-1.8.1-py3-none-any.whl.metadata (873 bytes)
Collecting werkzeug>=1.0.1 (from tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached werkzeug-3.0.1-py3-none-any.whl.metadata (4.1 kB)
Collecting cachetools<6.0,>=2.0.0 (from google-auth<3,>=1.6.3->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached cachetools-5.3.3-py3-none-any.whl.metadata (5.3 kB)
Collecting pyasn1-modules>=0.2.1 (from google-auth<3,>=1.6.3->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl.metadata (3.6 kB)
Collecting rsa<5,>=3.1.4 (from google-auth<3,>=1.6.3->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached rsa-4.9-py3-none-any.whl.metadata (4.2 kB)
Collecting requests-oauthlib>=0.7.0 (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl.metadata (10 kB)
Collecting pyasn1<0.6.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth<3,>=1.6.3->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached pyasn1-0.5.1-py2.py3-none-any.whl.metadata (8.6 kB)
Collecting oauthlib>=3.0.0 (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.11,>=2.10->tensorflow~=2.10.0->medaka>=1.7.1->piranha==1.2.2)
  Using cached oauthlib-3.2.2-py3-none-any.whl.metadata (7.5 kB)
Using cached Mako-1.2.0-py3-none-any.whl (78 kB)
Using cached cffi-1.15.0-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (446 kB)
Using cached numpy-1.23.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.1 MB)
Using cached pandas-1.5.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.1 MB)
Using cached snipit-1.2-py3-none-any.whl (29 kB)
Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB)
Using cached tensorflow-2.10.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (578.1 MB)
Using cached grpcio-1.62.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.5 MB)
Using cached h5py-3.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.8 MB)
Using cached edlib-1.3.9-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (363 kB)
Using cached ont_fast5_api-4.1.3-py3-none-any.whl (2.3 MB)
Using cached parasail-1.3.4-py2.py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (15.6 MB)
Using cached wurlitzer-3.0.3-py3-none-any.whl (7.3 kB)
Using cached absl_py-2.1.0-py3-none-any.whl (133 kB)
Using cached astunparse-1.6.3-py2.py3-none-any.whl (12 kB)
Using cached flatbuffers-23.5.26-py2.py3-none-any.whl (26 kB)
Using cached gast-0.4.0-py3-none-any.whl (9.8 kB)
Using cached google_pasta-0.2.0-py3-none-any.whl (57 kB)
Using cached keras-2.10.0-py2.py3-none-any.whl (1.7 MB)
Using cached Keras_Preprocessing-1.1.2-py2.py3-none-any.whl (42 kB)
Using cached libclang-16.0.6-py2.py3-none-manylinux2010_x86_64.whl (22.9 MB)
Using cached opt_einsum-3.3.0-py3-none-any.whl (65 kB)
Using cached protobuf-3.19.6-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Using cached sortedcontainers-2.4.0-py2.py3-none-any.whl (29 kB)
Using cached tensorboard-2.10.1-py3-none-any.whl (5.9 MB)
Using cached tensorflow_estimator-2.10.0-py2.py3-none-any.whl (438 kB)
Using cached tensorflow_io_gcs_filesystem-0.36.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (5.1 MB)
Using cached termcolor-2.4.0-py3-none-any.whl (7.7 kB)
Using cached pycparser-2.21-py2.py3-none-any.whl (118 kB)
Using cached google_auth-2.28.1-py2.py3-none-any.whl (186 kB)
Using cached google_auth_oauthlib-0.4.6-py2.py3-none-any.whl (18 kB)
Using cached Markdown-3.5.2-py3-none-any.whl (103 kB)
Using cached tensorboard_data_server-0.6.1-py3-none-manylinux2010_x86_64.whl (4.9 MB)
Using cached tensorboard_plugin_wit-1.8.1-py3-none-any.whl (781 kB)
Using cached werkzeug-3.0.1-py3-none-any.whl (226 kB)
Using cached cachetools-5.3.3-py3-none-any.whl (9.3 kB)
Using cached pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB)
Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Using cached rsa-4.9-py3-none-any.whl (34 kB)
Using cached oauthlib-3.2.2-py3-none-any.whl (151 kB)
Using cached pyasn1-0.5.1-py2.py3-none-any.whl (84 kB)
Building wheels for collected packages: piranha, medaka, pyspoa
  Building wheel for piranha (setup.py) ... done
  Created wheel for piranha: filename=piranha-1.2.2-py3-none-any.whl size=480641 sha256=6ce67042c7ade86de75a49fc425f92aad5ccea526ea13fb928df167fefb7f356
  Stored in directory: /tmp/pip-ephem-wheel-cache-myigcywu/wheels/be/d9/75/a91153899d7c169882e8203b1d64c3fd29545e780c4e95c722
  Building wheel for medaka (pyproject.toml) ... error
  error: subprocess-exited-with-error

× Building wheel for medaka (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [118 lines of output]
  Cannot import pyabpoa, some features may not be available.
  Cannot import wurlitzer, some features may not be available.
  /tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/cffi/cparser.py:154: UserWarning: String literal found in cdef() or type source. String literals are ignored here, but you should remove them anyway because some character sequences confuse pre-parsing.
    warnings.warn("String literal found in cdef() or type source. "
  Bundling models: ['r103_fast_g507', 'r103_fast_snp_g507', 'r103_fast_variant_g507', 'r103_hac_g507', 'r103_hac_snp_g507', 'r103_hac_variant_g507', 'r103_min_high_g345', 'r103_min_high_g360', 'r103_prom_high_g360', 'r103_prom_snp_g3210', 'r103_prom_variant_g3210', 'r103_sup_g507', 'r103_sup_snp_g507', 'r103_sup_variant_g507', 'r1041_e82_260bps_fast_g632', 'r1041_e82_260bps_fast_variant_g632', 'r1041_e82_260bps_hac_g632', 'r1041_e82_260bps_hac_v4.0.0', 'r1041_e82_260bps_hac_v4.1.0', 'r1041_e82_260bps_hac_variant_g632', 'r1041_e82_260bps_hac_variant_v4.1.0', 'r1041_e82_260bps_sup_g632', 'r1041_e82_260bps_sup_v4.0.0', 'r1041_e82_260bps_sup_v4.1.0', 'r1041_e82_260bps_sup_variant_g632', 'r1041_e82_260bps_sup_variant_v4.1.0', 'r1041_e82_400bps_fast_g615', 'r1041_e82_400bps_fast_g632', 'r1041_e82_400bps_fast_variant_g615', 'r1041_e82_400bps_fast_variant_g632', 'r1041_e82_400bps_hac_g615', 'r1041_e82_400bps_hac_g632', 'r1041_e82_400bps_hac_v4.0.0', 'r1041_e82_400bps_hac_v4.1.0', 'r1041_e82_400bps_hac_v4.2.0', 'r1041_e82_400bps_hac_v4.3.0', 'r1041_e82_400bps_hac_variant_g615', 'r1041_e82_400bps_hac_variant_g632', 'r1041_e82_400bps_hac_variant_v4.1.0', 'r1041_e82_400bps_hac_variant_v4.2.0', 'r1041_e82_400bps_hac_variant_v4.3.0', 'r1041_e82_400bps_sup_g615', 'r1041_e82_400bps_sup_v4.0.0', 'r1041_e82_400bps_sup_v4.1.0', 'r1041_e82_400bps_sup_v4.2.0', 'r1041_e82_400bps_sup_v4.3.0', 'r1041_e82_400bps_sup_variant_g615', 'r1041_e82_400bps_sup_variant_v4.1.0', 'r1041_e82_400bps_sup_variant_v4.2.0', 'r1041_e82_400bps_sup_variant_v4.3.0', 'r104_e81_fast_g5015', 'r104_e81_fast_variant_g5015', 'r104_e81_hac_g5015', 'r104_e81_hac_variant_g5015', 'r104_e81_sup_g5015', 'r104_e81_sup_g610', 'r104_e81_sup_variant_g610', 'r10_min_high_g303', 'r10_min_high_g340', 'r941_e81_fast_g514', 'r941_e81_fast_variant_g514', 'r941_e81_hac_g514', 'r941_e81_hac_variant_g514', 'r941_e81_sup_g514', 'r941_e81_sup_variant_g514', 'r941_min_fast_g303', 'r941_min_fast_g507', 'r941_min_fast_snp_g507', 'r941_min_fast_variant_g507', 'r941_min_hac_g507', 'r941_min_hac_snp_g507', 'r941_min_hac_variant_g507', 'r941_min_high_g303', 'r941_min_high_g330', 'r941_min_high_g340_rle', 'r941_min_high_g344', 'r941_min_high_g351', 'r941_min_high_g360', 'r941_min_sup_g507', 'r941_min_sup_snp_g507', 'r941_min_sup_variant_g507', 'r941_prom_fast_g303', 'r941_prom_fast_g507', 'r941_prom_fast_snp_g507', 'r941_prom_fast_variant_g507', 'r941_prom_hac_g507', 'r941_prom_hac_snp_g507', 'r941_prom_hac_variant_g507', 'r941_prom_high_g303', 'r941_prom_high_g330', 'r941_prom_high_g344', 'r941_prom_high_g360', 'r941_prom_high_g4011', 'r941_prom_snp_g303', 'r941_prom_snp_g322', 'r941_prom_snp_g360', 'r941_prom_sup_g507', 'r941_prom_sup_snp_g507', 'r941_prom_sup_variant_g507', 'r941_prom_variant_g303', 'r941_prom_variant_g322', 'r941_prom_variant_g360', 'r941_sup_plant_g610', 'r941_sup_plant_variant_g610']
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-x86_64-cpython-310
  creating build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/__init__.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/align.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/common.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/datastore.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/executor.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/features.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/keras_ext.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/labels.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/medaka.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/medaka_counts.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/models.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/options.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/prediction.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/rle.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/smolecule.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/stitch.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/tandem.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/training.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/variant.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/vcf.py -> build/lib.linux-x86_64-cpython-310/medaka
  copying medaka/wrappers.py -> build/lib.linux-x86_64-cpython-310/medaka
  creating build/lib.linux-x86_64-cpython-310/medaka/data
  copying medaka/data/r1041_e82_400bps_hac_v4.3.0_model.tar.gz -> build/lib.linux-x86_64-cpython-310/medaka/data
  copying medaka/data/r1041_e82_400bps_hac_variant_v4.3.0_model.tar.gz -> build/lib.linux-x86_64-cpython-310/medaka/data
  copying medaka/data/r1041_e82_400bps_sup_v4.3.0_model.tar.gz -> build/lib.linux-x86_64-cpython-310/medaka/data
  copying medaka/data/r1041_e82_400bps_sup_variant_v4.3.0_model.tar.gz -> build/lib.linux-x86_64-cpython-310/medaka/data
  running build_ext
  generating cffi module 'build/temp.linux-x86_64-cpython-310/libmedaka.c'
  creating build/temp.linux-x86_64-cpython-310
  Compiling htslib using Makefile
  cd submodules; \
      curl -L -o samtools-1.14.tar.bz2 https://github.com/samtools/samtools/releases/download/1.14/samtools-1.14.tar.bz2; \
      tar -xjf samtools-1.14.tar.bz2; \
      rm samtools-1.14.tar.bz2
    % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                   Dload  Upload   Total   Spent    Left  Speed

    0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
    0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0

  100 7563k  100 7563k    0     0  8588k      0 --:--:-- --:--:-- --:--:-- 8588k
  # this is required only to add in -fpic so we can build python module
  Making libhts.a
  cd submodules/samtools-1.14/htslib-1.14/ \
      && CFLAGS="-fpic -O3 -std=c99 -mtune=haswell" ./configure  \
      && make -j 4
  checking for gcc... gcc
  checking whether the C compiler works... no
  configure: error: in `/tmp/pip-install-51bonxeu/medaka_33297f14cdd047de835e25e435f9eaf6/submodules/samtools-1.14/htslib-1.14':
  configure: error: C compiler cannot create executables
  See `config.log' for more details
  make: *** [libhts.a] Error 77
  Traceback (most recent call last):
    File "piranha/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
      main()
    File "piranha/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
      json_out['return_val'] = hook(**hook_input['kwargs'])
    File "piranha/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 251, in build_wheel
      return _build_backend().build_wheel(wheel_directory, config_settings,
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 410, in build_wheel
      return self._build_with_temp_dir(
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 395, in _build_with_temp_dir
      self.run_setup()
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 487, in run_setup
      super().run_setup(setup_script=setup_script)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 311, in run_setup
      exec(code, locals())
    File "<string>", line 118, in <module>
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 103, in setup
      return distutils.core.setup(**attrs)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 185, in setup
      return run_commands(dist)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 201, in run_commands
      dist.run_commands()
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 969, in run_commands
      self.run_command(cmd)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 963, in run_command
      super().run_command(command)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
      cmd_obj.run()
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/wheel/bdist_wheel.py", line 368, in run
      self.run_command("build")
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
      self.distribution.run_command(command)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 963, in run_command
      super().run_command(command)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
      cmd_obj.run()
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/command/build.py", line 131, in run
      self.run_command(cmd_name)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 318, in run_command
      self.distribution.run_command(command)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 963, in run_command
      super().run_command(command)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 988, in run_command
      cmd_obj.run()
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/cffi/setuptools_ext.py", line 144, in run
      base_class.run(self)
    File "<string>", line 105, in run
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/cmd.py", line 339, in execute
      util.execute(func, args, msg, dry_run=self.dry_run)
    File "/tmp/pip-build-env-7_b72fhc/overlay/lib/python3.10/site-packages/setuptools/_distutils/util.py", line 337, in execute
      func(*args)
    File "<string>", line 103, in compile_hts
    File "piranha/lib/python3.10/subprocess.py", line 369, in check_call
      raise CalledProcessError(retcode, cmd)
  subprocess.CalledProcessError: Command '['make', 'libhts.a']' returned non-zero exit status 2.
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for medaka
  Building wheel for pyspoa (pyproject.toml) ... error
  error: subprocess-exited-with-error

× Building wheel for pyspoa (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [37 lines of output]
  running bdist_wheel
  running build
  running build_ext
  -- The CXX compiler identification is GNU 4.8.5
  -- Detecting CXX compiler ABI info
  -- Detecting CXX compiler ABI info - done
  -- Check for working CXX compiler: /usr/bin/c++ - skipped
  -- Detecting CXX compile features
  -- Detecting CXX compile features - done
  CMake Error at /tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/cmake/data/share/cmake-3.28/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
    Could NOT find ZLIB: Found unsuitable version "1.2.7", but required is at least "1.2.8" (found /usr/lib64/libz.so, )
  Call Stack (most recent call first):
    /tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/cmake/data/share/cmake-3.28/Modules/FindPackageHandleStandardArgs.cmake:598 (_FPHSA_FAILURE_MESSAGE)
    /tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/cmake/data/share/cmake-3.28/Modules/FindZLIB.cmake:199 (FIND_PACKAGE_HANDLE_STANDARD_ARGS)
    build/_deps/bioparser-src/CMakeLists.txt:22 (find_package)

  -- Configuring incomplete, errors occurred!
  make: *** No targets specified and no makefile found.  Stop.
  creating tmp
  gcc -pthread -B piranha/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem piranha/include -fPIC -O2 -isystem piranha/include -fPIC -Ipiranha/include/python3.10 -c /tmp/tmpst2h2pyy.cpp -o tmp/tmpst2h2pyy.o -std=c++14
  gcc: error: unrecognized command line option ‘-std=c++14’
  gcc -pthread -B piranha/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem piranha/include -fPIC -O2 -isystem piranha/include -fPIC -Ipiranha/include/python3.10 -c /tmp/tmp85uw8h0q.cpp -o tmp/tmp85uw8h0q.o -std=c++11
  gcc -pthread -B piranha/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem piranha/include -fPIC -O2 -isystem piranha/include -fPIC -Ipiranha/include/python3.10 -c /tmp/tmpqrrt235j.cpp -o tmp/tmpqrrt235j.o -fvisibility=hidden
  building 'spoa' extension
  creating build
  creating build/temp.linux-x86_64-cpython-310
  gcc -pthread -B piranha/compiler_compat -Wno-unused-result -Wsign-compare -DNDEBUG -fwrapv -O2 -Wall -fPIC -O2 -isystem piranha/include -fPIC -O2 -isystem piranha/include -fPIC -Isrc/include/spoa -Isrc/vendor/cereal/include -I/tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/pybind11/include -I/tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/pybind11/include -Ipiranha/include/python3.10 -c pyspoa.cpp -o build/temp.linux-x86_64-cpython-310/pyspoa.o -DVERSION_INFO=\"0.2.1\" -std=c++11 -fvisibility=hidden
  pyspoa.cpp: In function ‘pybind11::tuple poa(std::vector<std::basic_string<char> >, int, bool, int, int, int, int, int, int, pybind11::object)’:
  pyspoa.cpp:14:40: warning: ‘bool pybind11::handle::operator!=(const pybind11::handle&) const’ is deprecated (declared at /tmp/pip-build-env-07_nfu2d/overlay/lib/python3.10/site-packages/pybind11/include/pybind11/detail/../detail/../pytypes.h:292): Use !obj1.is(obj2) instead [-Wdeprecated-declarations]
       if (min_coverage != pybind11::none()) {
                                          ^
  creating build/lib.linux-x86_64-cpython-310
  g++ -pthread -B piranha/compiler_compat -shared -Wl,--allow-shlib-undefined -Wl,-rpath,piranha/lib -Wl,-rpath-link,piranha/lib -Lpiranha/lib -Wl,--allow-shlib-undefined -Wl,-rpath,piranha/lib -Wl,-rpath-link,piranha/lib -Lpiranha/lib build/temp.linux-x86_64-cpython-310/pyspoa.o src/build/lib/libspoa.a -o build/lib.linux-x86_64-cpython-310/spoa.cpython-310-x86_64-linux-gnu.so
  g++: error: src/build/lib/libspoa.a: No such file or directory
  error: command '/usr/bin/g++' failed with exit code 1
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for pyspoa
Successfully built piranha
Failed to build medaka pyspoa
ERROR: Could not build wheels for medaka, pyspoa, which is required to install pyproject.toml-based projects

aineniamh commented 4 months ago

Hi Amanda,

That's an impressive pip error 😆

So I'm not completely sure what the solution is, but pip is clearly struggling to build some of the dependencies that come with medaka.

The thing that stands out to me is error: command '/usr/bin/g++' failed with exit code 1, which suggests the build is compiling with your machine's native g++ install. In the piranha conda environment we specify an install of coreutils that should let a specific, known-good compiler version be used, but it appears that isn't the one being picked up.

Can you check you're running the pip command inside the piranha environment? If not, there might be a path priority issue.
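If you want to see what's actually being picked up, something like this should show it (just a quick diagnostic sketch, assuming a bash-like shell and that your environment is named piranha):

conda activate piranha
which gcc g++   # should resolve inside the conda env, not /usr/bin
gcc --version   # an old system GCC (e.g. 4.8.5, as in your log) would explain the failures
echo $PATH      # the environment's bin directory should come first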

a-qvecell commented 4 months ago

Yes, thank you 😅 I can confirm that I am running the install command with the environment sourced, so I don't know why it is not using the correct compiler 😢 Hmmm... I will ask around if anyone on the cluster has experienced something similar, but if you have any hints, please just let me know 😄

aineniamh commented 3 months ago

Hmm, you could try running conda install coreutils=9.1 -c conda-forge to ensure that it is installed in the environment, or alternatively (possibly a better solution) try installing directly through bioconda (instructions here):

mamba create --name myenvname piranha-polio
conda activate myenvname
mamba install piranha-polio -c bioconda -c conda-forge
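
Then a quick check from inside the activated environment:

piranha --version   # should print the installed version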

That should work then 🤞

a-qvecell commented 3 months ago

piranha --version now gives me 1.2.2, so looks like it worked 👍 Thank you!

aineniamh commented 3 months ago

Ah excellent! Glad it worked!

a-qvecell commented 3 months ago

However, this means that I cannot use features implemented since the last release in November, e.g. -rg species, right? Are you planning on making a new release anytime soon? :)

aineniamh commented 3 months ago

I've been working on an update that'll better handle diverse wild polio sequences, and I'm almost there. I plan to do a release when that's ready, so hopefully in the next week or two.

In the meantime, you could try installing the main branch of piranha on top of the bioconda install, as the dependency compilation may already have been carried out now?
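For example, something like this might do it (just a sketch; it assumes the myenvname environment from above, and uses pip to install whatever is currently on the repository's default branch):

conda activate myenvname
pip install git+https://github.com/polio-nanopore/piranha.git
piranha --version   # check which version is now reported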