mikemhenry opened 1 month ago
NumPy changed its C API in 1.20, which broke binary compatibility, so we probably should specify >=1.20. We also should specify <2.0, since they're planning another break in binary compatibility then.
Of course, this isn't really about OpenMM's requirements, just the requirements of a particular pip package compiled against a particular NumPy version. I wonder if that means it would be better to handle it with a patch in the recipe? Does PyPI have any mechanism for distinguishing between the requirements of the code and the requirements of a package compiled in a particular way?
(This is, of course, one of many ways in which Python dependency management is totally broken. It should be up to the library you're depending on to report the range of versions over which it maintains binary compatibility. It shouldn't be up to downstream packages to know that for every one of their dependencies.)
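For reference, a pin like the one proposed above would look like this in `pyproject.toml` (just a sketch; the exact bounds are what's under discussion here, not verified values):

```toml
[project]
name = "openmm"
dependencies = [
    # Runtime range over which the compiled extension is ABI-compatible
    "numpy>=1.20,<2.0",
]
```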
> It should be up to the library you're depending on to report the range of versions over which it maintains binary compatibility
:100: Yes, this is something that `conda build` can handle, but I am not sure of the best way to handle it here. I do agree that in a perfect world, at build time, we would inject the appropriate `install_requires` version range(s) for numpy. This could get complex quite quickly, but I think if we only need to worry about numpy, I wouldn't be surprised if there is even a numpy function you can call that spits out the ABI version ranges covered by whatever you have installed/just built against.
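On the "numpy function" point: I don't know of a public API that reports the ABI range directly, but the usual convention (a wheel built against NumPy 1.x stays binary-compatible with any later 1.y >= 1.x, up to the next major) could be sketched as a small helper. The function name and the compatibility rule here are assumptions, not anything NumPy ships:

```python
def numpy_abi_pin(built_against: str) -> str:
    """Derive a runtime pin from the NumPy version a wheel was built against,
    assuming ABI forward-compatibility within a major series (an assumption
    based on NumPy's usual policy, not a queried value)."""
    major, minor = (int(p) for p in built_against.split(".")[:2])
    return f"numpy>={major}.{minor},<{major + 1}.0"

# e.g. a wheel built against 1.26.4 at build time:
print(numpy_abi_pin("1.26.4"))  # → numpy>=1.26,<2.0
```

Something like this could run at build time to fill in `install_requires` from whatever NumPy the extension was actually compiled against.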
I will investigate the correct way to do this; perhaps for now we can put constraints on the "current" ABI range while we work on a more robust solution.
So I will figure out the correct upper and lower versions to pin here and update the PR + build patch.
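As a stopgap, the "current" range could be expressed directly as a pip requirement (bounds here are the ones from the discussion above, still to be confirmed):

```
# requirements.txt — stopgap pin while a build-time solution is worked out
numpy>=1.20,<2.0
```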
Not sure if we have an upper or lower bound to consider here, but things JustWorked with a `pip install numpy`, which pulled down `numpy==1.26.4`.