sissa-data-science / DADApy

Distance-based Analysis of DAta-manifolds in python
https://dadapy.readthedocs.io/
Apache License 2.0

Installation goes wrong in virtual environment #132

Open mascaretti opened 1 week ago

mascaretti commented 1 week ago

Subject of the issue

Installation fails when working in a virtual environment. I created a virtual environment using Mamba, then attempted to install with pip. Neither pip install dadapy nor pip install git+https://github.com/sissa-data-science/DADApy works. The errors I get are due to Cython code failing to compile.

My environment

Steps to reproduce

mamba create -n "ddp"
mamba activate ddp
mamba install pip
pip install git+https://github.com/sissa-data-science/DADApy

Expected behaviour

Installation is successful.

Actual behaviour

Collecting git+https://github.com/sissa-data-science/DADApy
  Cloning https://github.com/sissa-data-science/DADApy to /tmp/pip-req-build-29idjp1x
  Running command git clone --filter=blob:none --quiet https://github.com/sissa-data-science/DADApy /tmp/pip-req-build-29idjp1x
  Resolved https://github.com/sissa-data-science/DADApy to commit a1e50f877e007a7cd77877749e14de9ce31b8494
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Collecting numpy (from dadapy==0.3.0)
  Using cached numpy-2.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (60 kB)
Collecting scipy (from dadapy==0.3.0)
  Using cached scipy-1.14.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (60 kB)
Collecting scikit-learn (from dadapy==0.3.0)
  Using cached scikit_learn-1.5.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (11 kB)
Collecting matplotlib (from dadapy==0.3.0)
  Using cached matplotlib-3.9.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (11 kB)
Collecting seaborn (from dadapy==0.3.0)
  Using cached seaborn-0.13.2-py3-none-any.whl.metadata (5.4 kB)
Collecting contourpy>=1.0.1 (from matplotlib->dadapy==0.3.0)
  Using cached contourpy-1.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (5.8 kB)
Collecting cycler>=0.10 (from matplotlib->dadapy==0.3.0)
  Using cached cycler-0.12.1-py3-none-any.whl.metadata (3.8 kB)
Collecting fonttools>=4.22.0 (from matplotlib->dadapy==0.3.0)
  Using cached fonttools-4.53.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (162 kB)
Collecting kiwisolver>=1.3.1 (from matplotlib->dadapy==0.3.0)
  Using cached kiwisolver-1.4.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.4 kB)
Collecting packaging>=20.0 (from matplotlib->dadapy==0.3.0)
  Using cached packaging-24.1-py3-none-any.whl.metadata (3.2 kB)
Collecting pillow>=8 (from matplotlib->dadapy==0.3.0)
  Using cached pillow-10.3.0-cp312-cp312-manylinux_2_28_x86_64.whl.metadata (9.2 kB)
Collecting pyparsing>=2.3.1 (from matplotlib->dadapy==0.3.0)
  Using cached pyparsing-3.1.2-py3-none-any.whl.metadata (5.1 kB)
Collecting python-dateutil>=2.7 (from matplotlib->dadapy==0.3.0)
  Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB)
Collecting joblib>=1.2.0 (from scikit-learn->dadapy==0.3.0)
  Using cached joblib-1.4.2-py3-none-any.whl.metadata (5.4 kB)
Collecting threadpoolctl>=3.1.0 (from scikit-learn->dadapy==0.3.0)
  Using cached threadpoolctl-3.5.0-py3-none-any.whl.metadata (13 kB)
Collecting pandas>=1.2 (from seaborn->dadapy==0.3.0)
  Using cached pandas-2.2.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (19 kB)
Collecting pytz>=2020.1 (from pandas>=1.2->seaborn->dadapy==0.3.0)
  Using cached pytz-2024.1-py2.py3-none-any.whl.metadata (22 kB)
Collecting tzdata>=2022.7 (from pandas>=1.2->seaborn->dadapy==0.3.0)
  Using cached tzdata-2024.1-py2.py3-none-any.whl.metadata (1.4 kB)
Collecting six>=1.5 (from python-dateutil>=2.7->matplotlib->dadapy==0.3.0)
  Using cached six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB)
Using cached matplotlib-3.9.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (8.3 MB)
Using cached numpy-2.0.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (19.0 MB)
Using cached scikit_learn-1.5.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (13.1 MB)
Using cached scipy-1.14.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (40.8 MB)
Using cached seaborn-0.13.2-py3-none-any.whl (294 kB)
Using cached contourpy-1.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (309 kB)
Using cached cycler-0.12.1-py3-none-any.whl (8.3 kB)
Using cached fonttools-4.53.0-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (4.9 MB)
Using cached joblib-1.4.2-py3-none-any.whl (301 kB)
Using cached kiwisolver-1.4.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.5 MB)
Using cached packaging-24.1-py3-none-any.whl (53 kB)
Using cached pandas-2.2.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.7 MB)
Using cached pillow-10.3.0-cp312-cp312-manylinux_2_28_x86_64.whl (4.5 MB)
Using cached pyparsing-3.1.2-py3-none-any.whl (103 kB)
Using cached python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB)
Using cached threadpoolctl-3.5.0-py3-none-any.whl (18 kB)
Using cached pytz-2024.1-py2.py3-none-any.whl (505 kB)
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Using cached tzdata-2024.1-py2.py3-none-any.whl (345 kB)
Building wheels for collected packages: dadapy
  Building wheel for dadapy (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for dadapy (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [163 lines of output]
      OpenMP supported
      running bdist_wheel
      running build
      running build_py
      creating build
      creating build/lib.linux-x86_64-cpython-312
      creating build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/__init__.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/base.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/clustering.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/data.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/data_sets.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/density_advanced.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/density_estimation.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/feature_weighting.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/id_discrete.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/id_estimation.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/kstar.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/metric_comparisons.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/neigh_graph.py -> build/lib.linux-x86_64-cpython-312/dadapy
      copying dadapy/plot.py -> build/lib.linux-x86_64-cpython-312/dadapy
      creating build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/__init__.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/density_estimation.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/differentiable_imbalance.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/discrete_functions.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/id_estimation.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/metric_comparisons.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      copying dadapy/_utils/utils.py -> build/lib.linux-x86_64-cpython-312/dadapy/_utils
      running egg_info
      writing dadapy.egg-info/PKG-INFO
      writing dependency_links to dadapy.egg-info/dependency_links.txt
      writing requirements to dadapy.egg-info/requires.txt
      writing top-level names to dadapy.egg-info/top_level.txt
      reading manifest file 'dadapy.egg-info/SOURCES.txt'
      writing manifest file 'dadapy.egg-info/SOURCES.txt'
      /tmp/pip-build-env-fhw316zr/overlay/lib/python3.12/site-packages/setuptools/command/build_py.py:215: _Warning: Package 'dadapy._cython' is absent from the `packages` configuration.
      !!

              ********************************************************************************
              ############################
              # Package would be ignored #
              ############################
              Python recognizes 'dadapy._cython' as an importable package[^1],
              but it is absent from setuptools' `packages` configuration.

              This leads to an ambiguous overall configuration. If you want to distribute this
              package, please make sure that 'dadapy._cython' is explicitly added
              to the `packages` configuration field.

              Alternatively, you can also rely on setuptools' discovery methods
              (for example by using `find_namespace_packages(...)`/`find_namespace:`
              instead of `find_packages(...)`/`find:`).

              You can read more about "package discovery" on setuptools documentation page:

              - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

              If you don't want 'dadapy._cython' to be distributed and are
              already explicitly excluding 'dadapy._cython' via
              `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
              you can try to use `exclude_package_data`, or `include-package-data=False` in
              combination with a more fine grained `package-data` configuration.

              You can read more about "package data files" on setuptools documentation page:

              - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

              [^1]: For Python, any directory (with suitable naming) can be imported,
                    even if it does not contain any `.py` files.
                    On the other hand, currently there is no concept of package data
                    directory, all directories are treated like packages.
              ********************************************************************************

      !!
        check.warn(importable)
      /tmp/pip-build-env-fhw316zr/overlay/lib/python3.12/site-packages/setuptools/command/build_py.py:215: _Warning: Package 'dadapy._utils.discrete_volumes' is absent from the `packages` configuration.
      !!

              ********************************************************************************
              ############################
              # Package would be ignored #
              ############################
              Python recognizes 'dadapy._utils.discrete_volumes' as an importable package[^1],
              but it is absent from setuptools' `packages` configuration.

              This leads to an ambiguous overall configuration. If you want to distribute this
              package, please make sure that 'dadapy._utils.discrete_volumes' is explicitly added
              to the `packages` configuration field.

              Alternatively, you can also rely on setuptools' discovery methods
              (for example by using `find_namespace_packages(...)`/`find_namespace:`
              instead of `find_packages(...)`/`find:`).

              You can read more about "package discovery" on setuptools documentation page:

              - https://setuptools.pypa.io/en/latest/userguide/package_discovery.html

              If you don't want 'dadapy._utils.discrete_volumes' to be distributed and are
              already explicitly excluding 'dadapy._utils.discrete_volumes' via
              `find_namespace_packages(...)/find_namespace` or `find_packages(...)/find`,
              you can try to use `exclude_package_data`, or `include-package-data=False` in
              combination with a more fine grained `package-data` configuration.

              You can read more about "package data files" on setuptools documentation page:

              - https://setuptools.pypa.io/en/latest/userguide/datafiles.html

              [^1]: For Python, any directory (with suitable naming) can be imported,
                    even if it does not contain any `.py` files.
                    On the other hand, currently there is no concept of package data
                    directory, all directories are treated like packages.
              ********************************************************************************

      !!
        check.warn(importable)
      creating build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_clustering.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_clustering_v2.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_density.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_differentiable_imbalance.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_distances.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_grads.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_maximum_likelihood_opt.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_maximum_likelihood_opt_full.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      copying dadapy/_cython/cython_overlap.c -> build/lib.linux-x86_64-cpython-312/dadapy/_cython
      creating build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/L_coefficients_exact.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/L_coefficients_float.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/V_exact.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/L_coefficients_exact.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/L_coefficients_float.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      copying dadapy/_utils/discrete_volumes/V_exact.dat -> build/lib.linux-x86_64-cpython-312/dadapy/_utils/discrete_volumes
      running build_ext
      building 'dadapy._cython.cython_clustering' extension
      creating build/temp.linux-x86_64-cpython-312
      creating build/temp.linux-x86_64-cpython-312/dadapy
      creating build/temp.linux-x86_64-cpython-312/dadapy/_cython
      gcc -pthread -B /home/masca/miniforge3/envs/sissa/compiler_compat -fno-strict-overflow -DNDEBUG -O2 -Wall -fPIC -O2 -isystem /home/masca/miniforge3/envs/sissa/include -fPIC -O2 -isystem /home/masca/miniforge3/envs/sissa/include -fPIC -DNPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION -I/tmp/pip-build-env-fhw316zr/overlay/lib/python3.12/site-packages/numpy/_core/include -I/home/masca/miniforge3/envs/sissa/include/python3.12 -c dadapy/_cython/cython_clustering.c -o build/temp.linux-x86_64-cpython-312/dadapy/_cython/cython_clustering.o
      In file included from /home/masca/miniforge3/envs/sissa/include/python3.12/Python.h:38,
                       from dadapy/_cython/cython_clustering.c:35:
      dadapy/_cython/cython_clustering.c: In function ‘__pyx_f_5numpy_PyDataType_SHAPE’:
      dadapy/_cython/cython_clustering.c:4232:39: error: ‘PyArray_Descr’ {aka ‘struct _PyArray_Descr’} has no member named ‘subarray’
       4232 |     __Pyx_INCREF(((PyObject*)__pyx_v_d->subarray->shape));
            |                                       ^~
      /home/masca/miniforge3/envs/sissa/include/python3.12/pyport.h:24:38: note: in definition of macro ‘_Py_CAST’
         24 | #define _Py_CAST(type, expr) ((type)(expr))
            |                                      ^~~~
      /home/masca/miniforge3/envs/sissa/include/python3.12/object.h:661:35: note: in expansion of macro ‘_PyObject_CAST’
        661 | #  define Py_INCREF(op) Py_INCREF(_PyObject_CAST(op))
            |                                   ^~~~~~~~~~~~~~
      dadapy/_cython/cython_clustering.c:1861:27: note: in expansion of macro ‘Py_INCREF’
       1861 |   #define __Pyx_INCREF(r) Py_INCREF(r)
            |                           ^~~~~~~~~
      dadapy/_cython/cython_clustering.c:4232:5: note: in expansion of macro ‘__Pyx_INCREF’
       4232 |     __Pyx_INCREF(((PyObject*)__pyx_v_d->subarray->shape));
            |     ^~~~~~~~~~~~
      dadapy/_cython/cython_clustering.c:4233:36: error: ‘PyArray_Descr’ {aka ‘struct _PyArray_Descr’} has no member named ‘subarray’
       4233 |     __pyx_r = ((PyObject*)__pyx_v_d->subarray->shape);
            |                                    ^~
      error: command '/usr/bin/gcc' failed with exit code 1
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for dadapy
Failed to build dadapy
ERROR: Could not build wheels for dadapy, which is required to install pyproject.toml-based projects
mascaretti commented 1 week ago

I have found a workaround. I am not sure all the steps are needed, nor whether it works on machines other than mine.

In order (the full command sequence is sketched after the list):

  1. Created a virtual environment (with virtualenvwrapper), then updated pip and installed setuptools
  2. Installed NumPy v1.26.4 (the last release before v2.0.0)
  3. Cloned the repository (latest version)
  4. Installed Cython (from pip) and cythonised all the *.pyx files in Git/GitHub/DADApy/dadapy/_cython
  5. Installed by running python setup.py build_ext --inplace followed by pip install . in the repository directory.
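
For reference, the steps above roughly correspond to the following command sequence (a sketch only: the environment name is a placeholder, the repository path will differ on other machines, and some steps may turn out to be unnecessary):

mkvirtualenv ddp-workaround            # any fresh virtualenvwrapper environment
pip install --upgrade pip setuptools
pip install "numpy==1.26.4"            # last NumPy release before 2.0.0
pip install cython
git clone https://github.com/sissa-data-science/DADApy
cd DADApy
cython dadapy/_cython/*.pyx            # regenerate the C sources
python setup.py build_ext --inplace
pip install .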

This is the output of pip list:

pip list
Package         Version
--------------- -----------
contourpy       1.2.1
cycler          0.12.1
Cython          3.0.10
dadapy          0.3.0
fonttools       4.53.0
joblib          1.4.2
kiwisolver      1.4.5
matplotlib      3.9.0
numpy           1.26.4
packaging       24.1
pandas          2.2.2
pillow          10.3.0
pip             24.1
pyparsing       3.1.2
python-dateutil 2.9.0.post0
pytz            2024.1
scikit-learn    1.5.0
scipy           1.14.0
seaborn         0.13.2
setuptools      70.1.1
six             1.16.0
threadpoolctl   3.5.0
tzdata          2024.1

The version of Python is 3.12.4 and gcc is 14.1.1.

Installation using NumPy 2.0.0

I ran into trouble while cythonising (following exactly the same steps as above): running cython ./dadapy/_cython/*.pyx gives

cython dadapy/_cython/*.pyx
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_clustering.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_clustering.pyx:16:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_clustering_v2.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_clustering_v2.pyx:15:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_density.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_density.pyx:16:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_differentiable_imbalance.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_distances.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

#DTYPE = int
#floatTYPE = np.float
#boolTYPE = np.bool

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_distances.pyx:18:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_grads.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int_
floatTYPE = np.float_
boolTYPE = np.bool_

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_grads.pyx:12:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_maximum_likelihood_opt_full.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_maximum_likelihood_opt_full.pyx:13:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_maximum_likelihood_opt.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_maximum_likelihood_opt.pyx:13:9: 'int_t' is not a type identifier
/home/masca/.envs/ddp2/lib/python3.12/site-packages/Cython/Compiler/Main.py:381: FutureWarning: Cython directive 'language_level' not set, using '3str' for now (Py3). This has changed from earlier releases! File: /home/masca/Git/GitHub/DADApy/dadapy/_cython/cython_overlap.pyx
  tree = Parsing.p_module(s, pxd, full_module_name)

Error compiling Cython file:
------------------------------------------------------------
...
cimport numpy as np

DTYPE = np.int64
floatTYPE = np.float64

ctypedef np.int_t DTYPE_t
         ^
------------------------------------------------------------

dadapy/_cython/cython_overlap.pyx:10:9: 'int_t' is not a type identifier
AldoGl commented 1 week ago

@mascaretti thank you very much for your feedback and your help in addressing the issue.

The problem was also raised in #131 and indeed it seems related to the new version of numpy. What I had not imagined is that it might all come down to the np.int_t type identifier, which is missing in the new numpy. Together with @imacocco we will look into this further!
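
In the meantime, a quick local patch before cythonising could be something along these lines (an untested sketch, assuming the deprecated np.int_t typedefs are the only blockers in the .pyx sources; the proper fix may look different):

# replace the NumPy-2.0-incompatible typedef with an explicit 64-bit integer type
sed -i 's/np\.int_t/np.int64_t/g' dadapy/_cython/*.pyx
cython dadapy/_cython/*.pyx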

AldoGl commented 5 days ago

Hi @mascaretti, as already written in #131, we found out that this was a compatibility problem between Cython and numpy 2.0. Thanks to @diegodoimo's change in 6736b147fd4a5585d924a7e3ea12c44beb4e2763, we have released version 0.3.1, which you should now be able to install correctly.
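
For example, upgrading in the same environment should now work:

pip install --upgrade dadapy   # pulls in dadapy >= 0.3.1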