Closed: ghost closed this issue 1 year ago
Wheels are done for Windows and Linux. Waiting on OSX due to travis-ci/travis-ci#8552.
@xoviat, you have to be careful about how you build gmp/mpir when you want to distribute. Here's how I do it: https://github.com/symengine/symengine-wheels/blob/master/config.sh#L24. Note that you have to enable `--enable-fat`, or build for a generic x86 machine instead of ivybridge on the Travis machines.
Actually I just reused your `build_dependencies` script to build it. Does that need modification?
Yes. You need to add `--enable-fat` for Linux and add `--host=x86_64-darwin-none` (not sure) on OSX (`--enable-fat` for mpir doesn't work on Mac). Or you can switch to gmp instead of mpir and add `--enable-fat` for Linux and Mac both.
Here is a wrapper project I made that builds wheels for `python-flint`, in case anyone needs those: https://gitlab.com/alisianoi/flint-py
The README.md there describes the build process a bit. The wheels are on PyPI: https://pypi.org/project/flint-py/
I bumped the version from 0.3.0 to 0.3.5, but all the changes were to the build process.
@alisianoi Thanks for sharing! Would it make sense to integrate this with python-flint instead of having a separate project? It doesn't make much difference either way as long as it's maintained in some form.
I think it would definitely make sense to integrate this with python-flint. Setting this up to compile is not completely trivial so it would be much better if there are precompiled wheels for Windows, OSX and manylinux. They could be built in github actions.
@oscarbenjamin Indeed. Unfortunately I don't have much time as I'd like to maintain python-flint, and it'd be awesome if someone else was willing to step up. Not least considering all the new multivariate polynomial stuff (and more) in the latest Flint that hasn't been wrapped in python-flint yet.
Okay so maybe reopen this issue and I'll see what I can do.
I've almost managed to build a wheel on OSX (build script is below) but I'm hitting two problems right now:
1) Although I can install with `pip install .` and I can build a wheel with `pip wheel .`, when I try to install the wheel like `pip install *.whl` I get this error:
```
$ pip install python_flint-0.3.0-cp38-cp38-macosx_11_0_x86_64.whl
ERROR: python_flint-0.3.0-cp38-cp38-macosx_11_0_x86_64.whl is not a supported wheel on this platform.
```
I think that's telling me that the name of the wheel specifies a system different from this one somehow (even though I built it here!). I've tried a few renames of the file but haven't managed to get pip to accept it.
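The compatibility tags pip checks are just the last dash-separated fields of the wheel filename, so they can be inspected directly. A minimal sketch (the `wheel_tags` helper is mine, not a pip API): pip compares these tags against the list of tags the running interpreter supports, and "not a supported wheel on this platform" means none of them matched.

```python
# Sketch (my own helper, not part of pip): split a wheel filename into its
# compatibility tags. The filename layout is:
#   {dist}-{version}(-{build})?-{python}-{abi}-{platform}.whl
def wheel_tags(filename):
    stem = filename[: -len(".whl")]
    python_tag, abi_tag, platform_tag = stem.split("-")[-3:]
    return python_tag, abi_tag, platform_tag

print(wheel_tags("python_flint-0.3.0-cp38-cp38-macosx_11_0_x86_64.whl"))
# → ('cp38', 'cp38', 'macosx_11_0_x86_64')
```

This is also why renaming the `macosx_11_0` part of the filename (as in the script further down) changes whether pip will accept the wheel.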
2) I'll need delocate to bundle flint/arb to make the wheel relocatable, but that also doesn't seem to be working; I think it's because there is no package in python-flint (it's just a `flint.*` extension module). The delocate docs suggest that you need to make a package like:
```
flint/
    __init__.py  # this just does from .flint import *
    flint.so
```
Then the shared libraries can be bundled into the package folder:
https://github.com/matthew-brett/delocate#external-library-is-not-bundled-into-wheel
The script I have so far is:

```bash
#!/usr/bin/env bash

set -o errexit

PYTHON=python3.8
FLINTVER=2.7.1
ARBVER=2.19.0

mkdir -p local
PREFIX=$(pwd)/local

curl -O https://www.flintlib.org/flint-$FLINTVER.tar.gz
curl -O -L https://github.com/fredrik-johansson/arb/archive/refs/tags/$ARBVER.tar.gz
mv $ARBVER.tar.gz arb-$ARBVER.tar.gz
git clone https://github.com/fredrik-johansson/python-flint

tar xf flint-$FLINTVER.tar.gz
cd flint-$FLINTVER
./configure --prefix=$PREFIX
make -j4
make install
cd ..

tar xf arb-$ARBVER.tar.gz
cd arb-$ARBVER
./configure --prefix=$PREFIX --with-flint=$PREFIX
make -j4
make install
cd ..

$PYTHON -m venv venv
source venv/bin/activate
pip install -U pip wheel
pip install numpy cython==0.27.3
# N.B. bugs in both older and newer Cython versions...

cd python-flint
pip wheel . --global-option=build_ext \
    --global-option="-I$PREFIX/include" \
    --global-option="-L$PREFIX/lib"
pip install . --global-option=build_ext \
    --global-option="-I$PREFIX/include" \
    --global-option="-L$PREFIX/lib"
cd ..

python -c 'import flint'
```
Okay I've got this working to make some usable OSX wheels. Changes needed in `python_flint` are:

```bash
# Move src to src/flint
mv src flint
mkdir src
mv flint src
```
Apply this diff in setup.py:

```diff
diff --git a/setup.py b/setup.py
index 0a739c9..448466a 100644
--- a/setup.py
+++ b/setup.py
@@ -21,7 +21,7 @@ default_include_dirs += [
 ext_modules = [
     Extension(
-        "flint", ["src/pyflint.pyx"],
+        "flint.flint", ["src/flint/pyflint.pyx"],
         libraries=libraries,
         library_dirs=default_lib_dirs,
         include_dirs=default_include_dirs)
@@ -34,6 +34,8 @@ setup(
     name='python-flint',
     cmdclass={'build_ext': build_ext},
     ext_modules=ext_modules,
+    packages=['flint'],
+    package_dir={'': 'src'},
     description='Bindings for FLINT and Arb',
     version='0.3.0',
     url='https://github.com/python-flint/python-flint',
```
That gives the following package structure:

```
flint/
    __init__.py
    flint.so
```
Once those changes are made, this script generates a relocatable OSX wheel on OSX 11.1 with either Python 3.7 or 3.8 from the python.org installers. It does not work with Python 3.9. I think that is because a newer Cython version is needed, but the newer Cython has a bug that doesn't work for python_flint (#13).
EDIT (the changes above were made in #14 so this script works directly with master now):
```bash
#!/usr/bin/env bash

set -o errexit

PYTHON=python3.8
GMPVER=6.2.1
MPFRVER=4.1.0
FLINTVER=2.7.1
ARBVER=2.19.0
PYTHONFLINTVER=0.3.0

mkdir -p local
PREFIX=$(pwd)/local

curl -O https://gmplib.org/download/gmp/gmp-$GMPVER.tar.xz
curl -O https://www.mpfr.org/mpfr-current/mpfr-$MPFRVER.tar.gz
curl -O https://www.flintlib.org/flint-$FLINTVER.tar.gz
curl -O -L https://github.com/fredrik-johansson/arb/archive/refs/tags/$ARBVER.tar.gz
mv $ARBVER.tar.gz arb-$ARBVER.tar.gz
rm -rf python-flint
git clone https://github.com/fredrik-johansson/python-flint

tar xf gmp-$GMPVER.tar.xz
cd gmp-$GMPVER
./configure --prefix=$PREFIX --enable-fat --enable-shared --host=x86_64-apple-darwin
make -j3
make install
cd ..

tar -xf mpfr-$MPFRVER.tar.gz
cd mpfr-$MPFRVER
./configure --prefix=$PREFIX --with-gmp=$PREFIX
make -j3
make install
cd ..

tar xf flint-$FLINTVER.tar.gz
cd flint-$FLINTVER
./configure --prefix=$PREFIX --with-gmp=$PREFIX --with-mpfr=$PREFIX
make -j3
make install
cd ..

tar xf arb-$ARBVER.tar.gz
cd arb-$ARBVER
./configure --prefix=$PREFIX --with-flint=$PREFIX --with-gmp=$PREFIX --with-mpfr=$PREFIX
make -j3
make install
cd ..

$PYTHON -m venv venv
source venv/bin/activate
pip install -U pip wheel delocate
pip install numpy cython==0.27.3
# N.B. bugs in both older and newer Cython versions...

cd python-flint
pip wheel . --global-option=build_ext \
    --global-option="-I$PREFIX/include" \
    --global-option="-L$PREFIX/lib"
delocate-wheel -w .. *.whl
cd ..

tags="cp37-cp37m cp38-cp38"
osx_from="11_0"
osx_to="10_9"
for tag in $tags; do
    wheelold=python_flint-$PYTHONFLINTVER-$tag-macosx_${osx_from}_x86_64.whl
    wheelnew=python_flint-$PYTHONFLINTVER-$tag-macosx_${osx_to}_x86_64.whl
    if [ -f $wheelold ]; then
        mv $wheelold $wheelnew
    fi
    if [ -f $wheelnew ]; then
        wheelfinal=$wheelnew
    fi
done

echo ------------------------------------------
echo
echo Built wheel: $wheelfinal
echo
echo Link dependencies:
delocate-listdeps $wheelfinal
echo
pip install $wheelfinal
echo
echo Demonstration:
echo
python -c 'import flint; print("(3/2)**2 =", flint.fmpq(3, 2)**2)'
echo
echo Done!
echo
echo ------------------------------------------
```
The changes for `python_flint` that I described above were made in #14, so the script above builds a wheel from master now.
It doesn't build GMP or MPFR but still works on my system because I have those installed. I guess I should get it to compile them as well then I could see how to get it working in Github Actions.
Ultimately `python_flint` should probably use `cibuildwheel` to build wheels on Windows, OSX and Linux, but I'll see if I can get a single wheel build working directly first.
I've updated the script above so that it builds GMP and MPFR as well. I'm unsure about one thing, which is that delocate reports the dependencies as:

```
Link dependencies:
@loader_path/.dylibs/libarb-2.10.0.dylib
@loader_path/.dylibs/libflint-15.dylib
@loader_path/libflint-15.dylib
@loader_path/libgmp.10.dylib
@loader_path/libmpfr.6.dylib
```
That seems to list libflint twice. I'm not sure if something is messed up there or how to fix it...
At the time of writing (after merging #22), the situation is that wheels are built in CI for Linux and OSX using the latest releases of GMP, MPFR, Flint and Arb:
https://github.com/fredrik-johansson/python-flint/blob/4ec3a4b9abb6afb71b68e9c181e9f51681cc30f2/bin/build_variables.sh#L16-L21
I am trying to get a wheel built on Windows now using MSYS2 and the mingw-w64 toolchain (as is also used by numpy and gmpy2 for their PyPI wheels). As of now I've managed to build the dependencies and then build a wheel, but after installing the wheel `import flint` fails with a missing DLL error:
```
$ python -c 'import flint'
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "C:\Users\pc\src\flint\python-flint\install_venv\lib\site-packages\flint\__init__.py", line 1, in <module>
    from ._flint import *
ImportError: DLL load failed while importing _flint: The specified module could not be found.
```
Although this isn't yet working, I thought I'd record the steps I followed for now:

1. Run the `bin/build_dependencies.sh` script to build GMP, MPFR, Flint and Arb in `.local`.
2. Build `libpython310.a` and place it in the CPython `libs` directory.
3. Run `setup.py` to build the wheel.
4. Use `pip` to install the wheel into a venv.

Longer explanation:
I think it's important to use the official CPython binary releases from here: https://www.python.org/downloads/. PyPI wheels should be compatible with those builds of CPython rather than any potentially different binary in MSYS2, Cygwin etc. Specifically I installed Python 3.10.8: https://www.python.org/ftp/python/3.10.8/python-3.10.8-amd64.exe
I installed MSYS2 from https://repo.msys2.org/distrib/x86_64/ and specifically chose the `msys2-x86_64-20221028.exe` installer:
https://repo.msys2.org/distrib/x86_64/msys2-x86_64-20221028.exe
MSYS2 uses a rolling release model, making it hard to pin precise versions of the toolchain for reproducibility, so the recommendation is always just to use the latest versions of everything. Alternatively, I think that winlibs.com provides versioned binaries of mostly the same stuff, which might make it easier to pin the toolchain, but I haven't tried using that. After installing you can run MSYS2 by choosing "MSYS2 MINGW64" from the start menu.
To install the mingw-w64 toolchain use the MSYS2 package manager:

```
$ pacman -S mingw-w64-x86_64-gcc m4 make
```
This will just install the latest versions of a bunch of packages and I don't think it's possible (or at least it's not recommended) to try to pin them.
One advantage of using MSYS2 rather than MSVC is that it provides a unix-like environment, so the build script should be more or less the same as for actual unix systems. I suspect that at least some adaptations would be needed to the `bin/build_dependencies.sh` script to get it working (e.g. to provide mingw-w64 related build options), but for now at least I used the script without modification.
It takes about an hour to run on this computer which is a lot slower than in Linux (more like 5 minutes).
Although the script appears to complete successfully, there are a lot of compiler warnings, e.g. `dllexport attribute ignored`. I'm not sure if those are an actual problem or not. For Flint and Arb there are many warnings about size mismatches like:
```
In file included from C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/acb_poly.h:22,
                 from compose.c:14:
In function 'acb_set',
    inlined from '_acb_poly_compose_axnc' at compose.c:43:13:
C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/acb.h:131:5: warning: 'arb_set' reading 48 bytes from a region of size 32 [-Wstringop-overread]
  131 |     arb_set(acb_realref(z), acb_realref(x));
      |     ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/acb.h:131:5: note: referencing argument 2 of type 'const arb_struct[1]'
In file included from C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/acb.h:23:
C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/arb.h: In function '_acb_poly_compose_axnc':
C:/Users/pc/src/flint/python-flint/.local/src/arb-2.23.0/arb.h:149:6: note: in a call to function 'arb_set'
  149 | void arb_set(arb_t x, const arb_t y);
      |      ^~~~~~~
```
It's possible that there is some confusion over e.g. the size of fmpz, perhaps related to `long` or `long long` being different in mingw-w64 or Windows or something, but I haven't investigated.
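For reference, the underlying data-model difference can be checked from Python with ctypes. This doesn't diagnose the Flint build; it just illustrates that 64-bit Windows uses the LLP64 model (4-byte `long`) while 64-bit Linux/macOS use LP64 (8-byte `long`):

```python
import ctypes

# On 64-bit Windows (LLP64): long is 4 bytes, long long is 8.
# On 64-bit Linux/macOS (LP64): both are 8 bytes.
print("sizeof(long)      =", ctypes.sizeof(ctypes.c_long))
print("sizeof(long long) =", ctypes.sizeof(ctypes.c_longlong))
print("sizeof(void *)    =", ctypes.sizeof(ctypes.c_void_p))
```

If any of the C code (or generated Cython code) assumes `long` is pointer-sized, it would be wrong on 64-bit Windows, which is consistent with the size-mismatch warnings above.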
Although `build_dependencies.sh` completes without any "errors", I suspect that there are problems with the binaries that have been created, and I'm also confused about the output files generated. The script installs into the `.local` prefix and results in these files:
```
$ ls .local/include
acb.h            acb_modular.h    arb_mat.h    double_interval.h  mag.h
acb_calc.h       acb_poly.h       arb_poly.h   flint              mpf2mpfr.h
acb_dft.h        arb.h            arf.h        fmpr.h             mpfr.h
acb_dirichlet.h  arb_calc.h       bernoulli.h  fmpz_extras.h      partitions.h
acb_elliptic.h   arb_fmpz_poly.h  bool_mat.h   fmpzi.h
acb_hypgeom.h    arb_fpwrap.h     dirichlet.h  gmp.h
acb_mat.h        arb_hypgeom.h    dlog.h       hypgeom.h
$ ls .local/lib
libarb-2.dll  libarb.dll.2    libgmp.dll.a  libmpfr.dll.a  pkgconfig
libarb.dll    libflint.dll.a  libgmp.la     libmpfr.la
$ ls .local/bin
libflint-17.dll  libflint.dll.a  libgmp-10.dll  libmpfr-6.dll
```
So e.g. the lib directory contains `libarb.dll` but not `libflint.dll`; instead we have `libflint-17.dll`, but it has landed in the `bin` directory. I'm not sure exactly what combination of `.dll`, `.dll.a` and `.la` files are needed to build the wheel or to be available at runtime, and what directories they are supposed to be in. I have tried copying all of the DLL files into the installed `site-packages/flint` directory and renaming some like `libflint-17.dll -> libflint.dll`, and I've tried adding them to `PATH` and other things, but I still get the "missing DLL" error when importing flint. Frustratingly the error message doesn't say which DLL is failing to load.
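One thing worth noting (and a possible reason that adding the directories to `PATH` didn't help): since Python 3.8 on Windows, `PATH` is no longer searched when resolving the DLL dependencies of extension modules; directories have to be registered explicitly with `os.add_dll_directory`. A sketch, with a hypothetical path:

```python
import os

# Python 3.8+ on Windows ignores PATH for extension-module DLL dependencies.
# Directories containing the libflint/libarb/libgmp/libmpfr DLLs must be
# registered explicitly before the import. The path below is hypothetical.
if os.name == "nt":
    os.add_dll_directory(r"C:\Users\pc\src\flint\python-flint\.local\bin")
    import flint  # the DLLs can now be resolved from that directory
```

This is only a workaround for local debugging; for a wheel the DLLs still need to be bundled next to the extension module (which is what delvewheel does below).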
To make `libpython310.a` you need the mingw-w64 gendef.exe, so install that:

```
$ pacman -S mingw-w64-x86_64-tools-git
```
Then `cd` into the directory of your Python 3.10 installation and create the file, e.g.:

```
$ cd /c/Users/pc/AppData/Local/Programs/Python/Python310/
$ gendef python310.dll
$ dlltool --dllname python310.dll --def python310.def --output-lib libpython310.a
$ mv libpython310.a libs
```
Since we have the dependencies in `.local`, we need to tell `distutils` to add those to the include paths etc., and we also need to tell it to use the mingw-w64 compiler:

```
C_INCLUDE_PATH=.local/include/ LIBRARY_PATH=.local/lib/ .local/venv/Scripts/python setup.py build_ext -cmingw32 -f bdist_wheel
```
I created a venv in `.local` to install the build dependencies. This needs numpy, Cython and wheel to be installed from PyPI. The `-cmingw32` does not mean that it is using mingw: this is still mingw-w64.
I changed setup.py to have `libraries = ["arb", "flint", "gmp", "mpfr"]` (i.e. removed `pthreads`) and also `include_dirs=default_include_dirs + ['.local/include']`.
This build step is a lot faster (about 1 minute) and also produces lots of warnings like:

```
src/flint\pyflint.c:14960:101: note: expected '__pyx_t_5flint_6_flint_fmpz_struct *' {aka 'long int *'} but argument is of type 'fmpz *' {aka 'long long int *'}
14960 | static CYTHON_INLINE int __pyx_f_5flint_6_flint_fmpz_set_pylong(__pyx_t_5flint_6_flint_fmpz_struct *__pyx_v_x, PyObject *__pyx_v_obj) {
```
Again there seems to be some confusion about `long` vs `long long`. This is potentially to do with Windows having a 32-bit `long`, but I'm not sure. I have seen mention in some places that maybe mingw-w64's `long long` is incompatible with the MSVC one somehow.
After all of these steps there should be a wheel in the `dist` subdirectory:

```
$ ls dist
python_flint-0.3.0-cp310-cp310-win_amd64.whl
```
As mentioned before though I found that after installing the wheel I couldn't import it because of the missing dll error.
> 6. there seems to be some confusion about `long` vs `long long`

This is discussed in #10
> 2. I'm also confused about the output files generated. The script installs into the `.local` prefix and results in these files:
>
> ```
> $ ls .local/include
> acb.h            acb_modular.h    arb_mat.h    double_interval.h  mag.h
> acb_calc.h       acb_poly.h       arb_poly.h   flint              mpf2mpfr.h
> acb_dft.h        arb.h            arf.h        fmpr.h             mpfr.h
> acb_dirichlet.h  arb_calc.h       bernoulli.h  fmpz_extras.h      partitions.h
> acb_elliptic.h   arb_fmpz_poly.h  bool_mat.h   fmpzi.h
> acb_hypgeom.h    arb_fpwrap.h     dirichlet.h  gmp.h
> acb_mat.h        arb_hypgeom.h    dlog.h       hypgeom.h
> $ ls .local/lib
> libarb-2.dll  libarb.dll.2    libgmp.dll.a  libmpfr.dll.a  pkgconfig
> libarb.dll    libflint.dll.a  libgmp.la     libmpfr.la
> $ ls .local/bin
> libflint-17.dll  libflint.dll.a  libgmp-10.dll  libmpfr-6.dll
> ```
For comparison, these are the files I get after running the `build_dependencies.sh` script on Linux:

```
$ ls .local/include
acb_calc.h       acb_mat.h        arb.h          bool_mat.h         fmpz_extras.h  mpfr.h
acb_dft.h        acb_modular.h    arb_hypgeom.h  dirichlet.h        fmpzi.h        partitions.h
acb_dirichlet.h  acb_poly.h       arb_mat.h      dlog.h             gmp.h
acb_elliptic.h   arb_calc.h       arb_poly.h     double_interval.h  hypgeom.h
acb.h            arb_fmpz_poly.h  arf.h          flint              mag.h
acb_hypgeom.h    arb_fpwrap.h     bernoulli.h    fmpr.h             mpf2mpfr.h
$ ls .local/lib
libarb.so         libflint.so         libgmp.la     libgmp.so.10.4.1  libmpfr.so.6
libarb.so.2       libflint.so.17      libgmp.so     libmpfr.la        libmpfr.so.6.1.0
libarb.so.2.14.0  libflint.so.17.0.0  libgmp.so.10  libmpfr.so        pkgconfig
$ ls .local/bin
ls: cannot access '.local/bin': No such file or directory
```
If you `pip install mingw_ldd` then you can query the DLLs referred to by the `_flint*.pyd` file:

```
$ cd build/lib.win-amd64-cpython-310/flint/
$ ls
__init__.py  _flint.cp310-win_amd64.pyd
$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => not found
libflint-17.dll => not found
libgmp-10.dll => not found
msvcrt.dll => not found
python310.dll => not found
```
The kernel32.dll and msvcrt.dll files are in `/c/Windows/System32` and will be found when needed. The `python310.dll` is in the Python install and will also be found when needed.
The others (`libarb-2.dll`, `libflint-17.dll`, `libgmp-10.dll`) need to be copied from `.local/bin` and `.local/lib` into the directory with `_flint*.pyd`. Then `mingw-ldd` will reveal more missing DLLs:
```
$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => not found
libwinpthread-1.dll => not found
msvcrt.dll => not found
python310.dll => not found
```
Now `libmpfr-6.dll` comes in:
```
$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgcc_s_seh-1.dll => not found
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libmpfr-6.dll
libwinpthread-1.dll => not found
msvcrt.dll => not found
python310.dll => not found
```
The final two DLLs are `libwinpthread-1.dll` and `libgcc_s_seh-1.dll`, which can both be found in `/mingw64/bin`. Copying those in gives:
```
$ mingw-ldd _flint.cp310-win_amd64.pyd --dll-lookup-dirs .
kernel32.dll => not found
libarb-2.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libarb-2.dll
libflint-17.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libflint-17.dll
libgcc_s_seh-1.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgcc_s_seh-1.dll
libgmp-10.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libgmp-10.dll
libmpfr-6.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libmpfr-6.dll
libwinpthread-1.dll => C:\Users\pc\src\flint\python-flint\build\lib.win-amd64-cpython-310\flint\libwinpthread-1.dll
msvcrt.dll => not found
python310.dll => not found
```
And so we're good for DLLs. Now let's try importing:
```
$ cd ..
(install_venv)
pc@NUCPC MINGW64 /c/Users/pc/src/flint/python-flint/build/lib.win-amd64-cpython-310
$ python
Python 3.10.8 (tags/v3.10.8:aaaf517, Oct 11 2022, 16:50:30) [MSC v.1933 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import flint
>>> flint.fmpz(2)
2
>>> flint.fmpz(2) ** 2
73786976329197944836
>>> flint.fmpz(2) + 2
-2079229367761764350
>>> flint.fmpz(2) + flint.fmpz(2)
-2079229359171829756
```
Okay so we've managed to load the DLLs and import the module. That's step 1, but obviously we need `2 + 2 = 4`... I guess that's #10 and the `long` mismatch. Indeed, if we run the tests:
```
$ python test/test.py
test_fmpz...Traceback (most recent call last):
  File "C:\Users\pc\src\flint\python-flint\test\test.py", line 439, in <module>
    sys.stdout.write("test_fmpz..."); test_fmpz(); print("OK")
  File "C:\Users\pc\src\flint\python-flint\test\test.py", line 26, in test_fmpz
    assert ltype(s) + rtype(t) == s + t
AssertionError
```
The cibuildwheel project does not do anything to "repair" Windows wheels by bundling DLLs (like using auditwheel or delocate for Linux or OSX wheels). There is a project called `delvewheel` that can be used for this though. It can check the DLL dependencies of a wheel and bundle them in:
```
$ pip install delvewheel
$ delvewheel show python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/
Analyzing python_flint-0.3.0-cp310-cp310-win_amd64.whl
The following DLLs will be copied into the wheel.
    libflint-17.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libflint-17.dll)
    libgmp-10.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libgmp-10.dll)
    libmpfr-6.dll (C:\Users\pc\src\flint\python-flint\.local\bin\libmpfr-6.dll)
    libarb-2.dll (C:\Users\pc\src\flint\python-flint\.local\lib\libarb-2.dll)
    libgcc_s_seh-1.dll (C:\msys64\mingw64\bin\libgcc_s_seh-1.dll)
    libwinpthread-1.dll (C:\msys64\mingw64\bin\libwinpthread-1.dll)
The following DLLs are assumed to be present in the end user's environment and will not be copied into the wheel.
    kernel32.dll
    msvcrt.dll
    python310.dll
```
The `repair` command fails because `_flint*.pyd` has "trailing data". I'm not sure exactly what that means yet:
```
$ delvewheel repair python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/
repairing python_flint-0.3.0-cp310-cp310-win_amd64.whl
finding DLL dependencies
copying DLLs into python_flint.libs
mangling DLL names
Traceback (most recent call last):
  File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\pc\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\pc\src\flint\python-flint\.local\venv\Scripts\delvewheel.exe\__main__.py", line 7, in <module>
  File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\__main__.py", line 67, in main
    wr.repair(args.target, no_mangles, args.no_mangle_all, args.lib_sdir)
  File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\_wheel_repair.py", line 590, in repair
    _dll_utils.replace_needed(extension_module_path, needed, name_mangler, self._verbose)
  File "C:\Users\pc\src\flint\python-flint\.local\venv\lib\site-packages\delvewheel\_dll_utils.py", line 361, in replace_needed
    raise RuntimeError(
RuntimeError: Unable to rename the dependencies of _flint.cp310-win_amd64.pyd because this DLL has trailing data. If this DLL was created with MinGW, run the strip utility. Otherwise, include libgmp-10.dll;libflint-17.dll;libarb-2.dll in the --no-mangle flag. In addition, if you believe that delvewheel should avoid name-mangling a specific DLL by default, open an issue at https://github.com/adang1345/delvewheel/issues and include this error message.
```
The `--no-mangle` flag can make `delvewheel` work and produce a wheel with all DLLs contained:
```
$ delvewheel repair python_flint-0.3.0-cp310-cp310-win_amd64.whl --add-path .local/bin:.local/lib/ '--no-mangle=libflint-17.dll;libarb-2.dll;libgmp-10.dll;libmpfr-6.dll;libgcc_s_seh-1.dll'
repairing python_flint-0.3.0-cp310-cp310-win_amd64.whl
finding DLL dependencies
copying DLLs into python_flint.libs
mangling DLL names
patching flint\__init__.py
updating python_flint-0.3.0.dist-info\RECORD
repackaging wheel
fixed wheel written to C:\Users\pc\src\flint\python-flint\wheelhouse\python_flint-0.3.0-cp310-cp310-win_amd64.whl
```
The `delvewheel` README suggests that this is a bad idea though and says:

> To avoid DLL hell, we mangle the file names of most DLLs that are vendored into the wheel. This way, a Python process that tries loading a vendored DLL does not end up using a different DLL with the same name. Due to a limitation in the machomachomangler dependency, delvewheel is unable to name-mangle DLLs whose dependents contain extra data at the end of the binary. If your DLL was created with MinGW, you can use the strip utility to remove the extra data. Otherwise, use the --no-mangle flag.
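Since a wheel is just a zip archive, it's easy to check what delvewheel actually vendored in. A small sketch (the `list_dlls` helper and the demo wheel contents are made up for illustration; delvewheel puts vendored DLLs under a `<package>.libs/` directory, as in the repair log above):

```python
import os
import tempfile
import zipfile

def list_dlls(wheel_path):
    """Return the paths of all DLLs stored inside a wheel (a zip archive)."""
    with zipfile.ZipFile(wheel_path) as zf:
        return sorted(n for n in zf.namelist() if n.lower().endswith(".dll"))

# Build a tiny fake wheel to demonstrate the helper on.
path = os.path.join(tempfile.mkdtemp(), "demo-0.1-py3-none-any.whl")
with zipfile.ZipFile(path, "w") as zf:
    zf.writestr("demo/__init__.py", "")
    zf.writestr("demo.libs/libdemo-1.dll", b"")

print(list_dlls(path))  # → ['demo.libs/libdemo-1.dll']
```

Running this kind of check against the repaired `python_flint` wheel is a quick way to confirm that all six DLLs from the `delvewheel show` output actually made it in.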
Here is a script that I've tested more or less end to end in a fresh msys2 install. This builds and tests a Windows wheel:
```bash
#!/bin/bash
#
# make_wheels.sh
#
# Build relocatable Windows wheel for python_flint using the mingw-w64
# toolchain in the MSYS2 environment.
#
# - First install Python
#
#       https://www.python.org/ftp/python/3.10.8/python-3.10.8-amd64.exe
#
# - Then checkout the code:
#
#       $ git clone https://github.com/fredrik-johansson/python-flint
#
# - Then install msys2
#
#       https://repo.msys2.org/distrib/x86_64/msys2-x86_64-20221028.exe
#
# - Then open msys2, cd into the checked out repo. Change setup.py to say
#
#       libraries = ["arb", "flint", "mpfr", "gmp"]
#
# - Then run this script.

set -o errexit

PYTHONDIR=/c/Users/pc/AppData/Local/Programs/Python/Python310/
WHEELNAME=python_flint-0.3.0-cp310-cp310-win_amd64.whl

# Install the mingw-w64 toolchain
pacman -S --noconfirm mingw-w64-x86_64-gcc m4 make mingw-w64-x86_64-tools-git

# This takes ~1hr
bin/build_dependencies_unix.sh

# Add the libpython310.a file to Python installation
cd $PYTHONDIR
gendef python310.dll
dlltool --dllname python310.dll --def python310.def --output-lib libpython310.a
mv libpython310.a libs
cd -

# Make a virtual environment to install the build dependencies
python -m venv .local/venv
source .local/venv/Scripts/activate
pip install numpy cython wheel delvewheel

# Build the wheel
C_INCLUDE_PATH=.local/include/ LIBRARY_PATH=.local/lib/ python setup.py build_ext -cmingw32 -f bdist_wheel

# Make the wheel relocatable
delvewheel repair dist/python_flint-0.3.0-cp310-cp310-win_amd64.whl \
    --add-path .local/bin:.local/lib/ \
    '--no-mangle=libflint-17.dll;libarb-2.dll;libgmp-10.dll;libmpfr-6.dll;libgcc_s_seh-1.dll'

# Make a virtual environment to test the wheel
python -m venv test_env
source test_env/Scripts/activate
pip install wheelhouse/$WHEELNAME
python -c 'import flint; print(flint.fmpz(2) + 2)'  # 2 + 2 = 4?
```
Of course the output is that `2+2` is `-4153699946119299070`, so issue #10 remains.
After #26 and #31, `2+2 = 4` now and the basic test suite passes. The outstanding items are:
- The doctests fail on Windows (see fix: make test suite run doctests #31) so there is some bug somewhere. I've opened #32 to track this.
> The doctests fail on Windows (see fix: make test suite run doctests #31) so there is some bug somewhere.
That's now fixed by #37. Now wheels are built in CI for Windows, OSX and Linux and all tests pass on all platforms although some parts of the codebase are not yet covered by the tests.
> The OSX wheels are intel only rather than universal2 so they cannot be used with newer ARM-based Macs.
It looks like gmpy2 have wheels for Apple ARM but it seems that they need to patch GMP to make it work: https://github.com/aleaxit/gmpy/blob/9832ea59d76ffad7462b0ce9013a86e3e007435f/scripts/patch-arm64.diff#L2-L7 That changeset is dated 29th Nov 2020 and seems to be in the GMP repo but apparently there hasn't been a GMP release since 14th Nov 2020.
Also gmpy2 provides separate arm64 wheels rather than universal2 wheels which might be easier to do.
> Also gmpy2 provides separate arm64 wheels rather than universal2 wheels which might be easier to do.
My experience is that separate `arm64` and `x86` wheels are indeed easier than `universal2` for complicated wheels like this one. When I last checked, `pip install` preferred the single-architecture wheels even if the Python in question is a universal binary, so `universal2` wheels are likely to rarely be used even if they are available.
> universal2 wheels are likely to rarely be used even if they are available.
I was intending to produce only `universal2` wheels, in which case I think that they would always be used (unless I misunderstand). If it is easier as you say though to make separate wheels, then it's probably better to go that path: I don't want to make this any more difficult than needed!
Right now I'm tempted to say let's postpone Apple ARM though and just put the other wheels on PyPI for now so that I can start working on downstream integration in parallel.
> I was intending to produce only universal2 wheels in which case I think that they would always be used (unless I misunderstand).
Yes, that's correct.
> If it is easier as you say though to make separate wheels then it's probably better to go that path: I don't want to make this any more difficult than needed!
With the similar-sized package CyPari it did take some extra effort to build universal wheels, and with this package we have so far been unable to do so.
I'd recommend using Cirrus CI for macOS arm64 wheels. It provides free native arm64 runners, and it has been a nice experience using Cirrus CI for SciPy and the other projects that I've seen try it.
universal2 wheels are kinda pointless, thin wheels are much better to produce (and easier).
Thanks Ralf. Do you know why the SciPy Cirrus CI setup uses homebrew to build the wheels rather than one of the Cirrus CI Python images? Does that have any effect on binary compatibility?
I don't remember why that is - it shouldn't have effect on binary compatibility AFAIK. @andyfaff can you comment on the Homebrew usage?
I suppose the homebrew python is only used to run cibuildwheel which will then install its own versions of Python to do the actual wheel building so it shouldn't affect binary compatibility.
I guess the reason for using homebrew is that with Cirrus CI you have to choose which image to use and if you want to run on OSX arm64 you do:
```yaml
macos_instance:
  image: ghcr.io/cirruslabs/macos-monterey-xcode:13.3.1
```

https://cirrus-ci.org/guide/macOS/
The Cirrus CI recommended way to have Python ready is to do:

```yaml
container:
  image: python:slim
```

https://cirrus-ci.org/examples/#building-pypi-packages
I don't understand exactly what all of these configuration options do. Maybe it's not possible to combine those and we're forced to choose between the two images in which case we can either choose to have Python installed or we can choose to use arm64 but then have to install everything ourselves (except homebrew which is preinstalled).
I suppose the homebrew python is only used to run cibuildwheel which will then install its own versions of Python to do the actual wheel building so it shouldn't affect binary compatibility.
That's correct. Homebrew python is only used to install cibuildwheel. When running, cibuildwheel takes control of installing all the pythons required for each wheel. For each python3.x wheel you build it downloads and installs the canonical cpython interpreter, which is then used to build the wheel. Binary compatibility is then with that python. Homebrew python is only used to bootstrap the process, it's not necessary to use a cirrus python image.
Cibuildwheel uses a similar process for windows. On Linux cibuildwheel uses the PyPA manylinux docker images to build wheels.
For scipy we use cirrus to build macosx_arm64 wheels and GitHub to build macosx_x86_64. The thin wheels are smaller in size. If absolutely required you might be able to direct cibuildwheel to make a fat binary on cirrus; macOS has no issues cross-compiling C code on an M1, but it would depend on how complex your build process is. We don't do this with scipy as we have to build Fortran code as well, and as soon as Fortran compilers are involved things get messy.
Thanks @andyfaff.
Unless I misunderstand, it looks like it's straightforward for cibuildwheel to build any of x86_64, arm64 and universal2 under OSX with either architecture as the build host. The tricky parts, if not using purely native builds, are:
If there were a way to get the artefact from GitHub Actions to Cirrus then I guess we would only need Cirrus to test the wheels under M1.
Thanks all for your help.
I've opened #42 which adds Cirrus CI config to build OSX arm64 wheels. I already tested it in my fork including downloading the wheels and testing them locally.
Cirrus is also useful for building linux_aarch64 natively.
Is linux_aarch64 a significant platform? Does it get a lot of downloads or is there some other metric we could use?
I'm aware that there are some size limits in PyPI although I'm not sure exactly what they are. Currently with linux_x86_64, win_x86_64, macos_x86_64 and macos_arm64 and 3 versions of Python we're up to 150 megabytes of wheels for the next release of python_flint. The Linux wheels are the biggest weighing in at 30 megabytes each so I guess aarch64 would be another 100 megabytes.
Certainly for now at least I think aarch64 is out of scope for the next release. We're very close to having the platforms above sorted and then we should get a release of python_flint out ASAP.
The outstanding issues btw are:
Scipy has started making linux_aarch64 wheels. We were getting a few requests for them; I don't have the stats to hand. It's not easy to solve: the range of OS, processor, etc. is quite large.
I'm aware that there are some size limits in PyPI although I'm not sure exactly what they are. Currently with linux_x86_64, win_x86_64, macos_x86_64 and macos_arm64 and 3 versions of Python we're up to 150 megabytes of wheels for the next release of python_flint.
I believe the default limit on PyPI is 60 MB per wheel. I have uploaded single releases to PyPI where the 25+ wheels weighed in at over 300 MB without a hitch.
- Resolve the msvcrt.dll issue for the Windows wheels (Maybe it's fine as it is? The wheels seem to work...).
My experience is that essentially all Windows computers have msvcrt.dll installed by some application or other. Moreover, your first run of delvewheel says at the bottom that it will not fix the references to msvcrt.dll because they are presumed to be on the user's system. So I think you're probably fine on this already.
My experience is that essentially all Windows computers have msvcrt.dll installed by some application or other.
I'm not concerned that the end users system might not have msvcrt.dll. I think it's pretty common and won't go away for a long time.
My concern is about mixing C runtimes within the same process. To be clear I am absolutely not any kind of expert on these things and I'll be very happy to accept that others know more than me. Let me explain my understanding so that I can be corrected if I've got it wrong...
Here I am building _flint.pyd, which is a DLL file, and I'm bundling it into a wheel with a bunch of other DLLs (libgmp.dll etc.). The intended target for these is a user who is using the standard builds of CPython for Windows, which are built with MSVC and link against the UCRT DLLs. When the user uses this module they will be running a CPython executable, plus its own DLLs, that are all linked against the UCRT DLLs. Currently the DLLs in the Windows wheel here are linked against msvcrt.dll, so import flint means loading DLLs that are compiled against and will use msvcrt.dll. That means that multiple C runtimes will be loaded in the same process and might in some way conflict with one another.
My understanding is that it can be possible to mix C runtimes like this if it is done very carefully, and that there are various rules, like not passing file handles or other objects between the parts that use different runtimes. I don't know what all the rules are for doing this, and I also don't really have a good way to audit the four different libraries being bundled to see whether this is being done correctly. However, it seems to work okay right now: the wheel works at runtime, doesn't crash, and passes all tests both locally and in CI.
I made a PR #41 which uses the msys2 UCRT toolchain to build all the dependencies. I've tested that part locally and in CI, and it seems that the DLLs for GMP, Flint etc. all link against UCRT as intended. However, it seems like the _flint.pyd that is built by numpy.distutils using mingw64 somehow still links against msvcrt.dll. Those wheels build okay but then crash immediately at runtime. I think what that shows is that there are potential problems involved in mixing runtimes, although the way it's done currently with the Windows wheels seems to work so far.
I'm not sure exactly how to tell numpy.distutils to use the UCRT toolchain for building _flint.pyd (I think it is using it...) or how to tell it that msvcrt.dll should not be linked. One possible solution is to switch away from msys2 and the mingw64 toolchain when building _flint.pyd and use MSVC for that instead. I just gave that a try and it fails because MSVC can't find pthreads.h. I guess I could build Flint with pthreads disabled, but I'm not sure that's worth it to fix something that isn't necessarily a problem. I also tried vcpkg install pthreads:x64-windows, which seemed to install pthreads, but I still got the same error: I'm not sure how to tell MSVC, at the point of building _flint.pyd, that it should use that pthreads library and make its headers available.
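As a crude sanity check of which C runtime a built .pyd or bundled DLL references, you can scan the binary for DLL-name strings. This is only a heuristic sketch (a proper check would use dumpbin /dependents or a real PE parser like pefile), and the function name here is made up:

```python
import re


def referenced_dlls(path):
    # Heuristic: DLL import names appear as ASCII strings inside a PE
    # file, so a raw byte scan finds them without parsing the import
    # table properly. False positives are possible.
    with open(path, "rb") as f:
        data = f.read()
    names = re.findall(rb"[A-Za-z0-9_.+-]+\.dll", data, re.IGNORECASE)
    return sorted({n.decode("ascii").lower() for n in names})
```

If `referenced_dlls("_flint.pyd")` lists msvcrt.dll, the old CRT is at least mentioned in the binary, which is a quick way to spot-check the toolchain output.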
If you're only compiling C code and not Fortran, then can't you just use the normal build chain and avoid mingw64?
To be fair I haven't actually tried to build GMP with MSVC. I was really just following the approach taken by gmpy since I know they've been working at this for longer.
The problem with MSVC is compiling GMP. This is also why gmpy uses mingw64 for building the C dependencies (https://github.com/aleaxit/gmpy/issues/352#issuecomment-1360941902).
The attraction of GMP is that it is really fast at integer arithmetic and much of what this collection of C libraries does is arranged around leveraging that speed by using fast integer multiplication to speed up everything else. GMP gets this speed in part by having a lot of hand-written assembly code for different CPUs:
https://gmplib.org/repo/gmp-6.2/file/tip/mpn
With MSVC we can't use --enable-fat, which is what is needed to bundle the code for all these architecture variants and have the right one selected at runtime: we would instead need to pin a lowest common denominator at build time.
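For concreteness, the relevant part of a redistributable GMP build on Linux looks roughly like this (a sketch with placeholder paths, not the project's actual build script):

```sh
# --enable-fat compiles the assembly for many x86-64 microarchitectures
# and selects the right one at runtime, so the resulting wheel isn't
# tuned to the CI machine's CPU.
./configure --prefix="$PREFIX" --enable-shared --enable-fat
make -j4
make install
```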
There is a fork of GMP called MPIR which was intended to have better MSVC support among other things. I think that's what the current conda packages of python-flint for Windows use. MPIR is no longer maintained though so I think it's better to use GMP.
I've just merged https://github.com/fredrik-johansson/python-flint/pull/42 which builds and tests wheels for macos on arm64. That means that CI now builds and tests the following wheels:
I think that's good to go and it's time for a new release of python_flint so we can let people test these wheels out.
There is a fork of GMP called MPIR which was intended to have better MSVC support among other things. [...] MPIR is no longer maintained though so I think it's better to use GMP.
Hi Oscar, MPIR is still fully supported on Windows x64 (by me) and I have a complete working build chain using Visual Studio for MPIR, MPFR, MPC, FLINT2 and ARB. I support GMPY2 and I would hope to support Python Flint on Windows x64 using MPIR. If I can help with what you are doing, please let me know.
Last night I pushed a 0.4.0 release of python-flint to PyPI with binary wheels for Windows x86-64, OSX x86-64 and arm64 and manylinux_2_17 x86-64. https://pypi.org/project/python-flint/#files
I was hoping I could close this issue but unfortunately I immediately discovered a problem with the Linux wheels when testing them locally:
$ pip install 0.4.0/python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Processing ./0.4.0/python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Installing collected packages: python-flint
Successfully installed python-flint-0.4.0
$ python -c 'import flint; flint.fmpz("1")'
Illegal instruction (core dumped)
I've deleted the Linux wheels from PyPI now so pip install python-flint goes back to trying to build from the sdist on Linux. I am investigating the problem with the Linux wheels and will write a follow-up comment about it shortly.
In the meantime, though, pip install python-flint should now install prebuilt binaries on Windows or OSX. I would appreciate it if anyone could report testing these wheels, which you can do like:
pip install python-flint
git clone https://github.com/fredrik-johansson/python-flint.git
python python-flint/test/test.py # takes about 1 second
python python-flint/test/dtest.py # takes a couple of minutes
I have tested locally on Windows and OSX x86-64 but I don't immediately have access to OSX arm64 (i.e. Apple M1 hardware) to test there. Theoretically all these wheels were tested in CI, but as seen with the Linux wheels that is not enough to guarantee that they work outside of CI.
So the Linux wheels seem to be double-badged as manylinux_2_17 and manylinux2014. I'm not sure how that happened. I can't upload an example here because the file is too large. This is the contents of the file:
$ wheel unpack python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
Unpacking to: python_flint-0.4.0...OK
$ tree python_flint-0.4.0
python_flint-0.4.0
├── flint
│ ├── _flint.cpython-311-x86_64-linux-gnu.so
│ └── __init__.py
├── python_flint-0.4.0.dist-info
│ ├── LICENSE
│ ├── METADATA
│ ├── RECORD
│ ├── top_level.txt
│ └── WHEEL
└── python_flint.libs
├── libarb-4c2cc9a4.so.2.14.0
├── libflint-c1ff2639.so.17.0.0
├── libgmp-bfc87f94.so.10.5.0
└── libmpfr-90ec1309.so.6.1.0
3 directories, 11 files
It looks like auditwheel has bundled the right .so files for the immediate dependencies but has not bundled anything else. I am not sure if that should be considered correct or not.
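Since a wheel is just a zip archive, you can also list what auditwheel bundled without installing or unpacking it. A small sketch (the function name is mine, not part of any tool):

```python
import zipfile


def bundled_libs(wheel_path):
    # auditwheel copies relocated external shared libraries into a
    # "<package>.libs/" directory inside the wheel; list those members.
    with zipfile.ZipFile(wheel_path) as whl:
        return sorted(n for n in whl.namelist()
                      if ".libs/" in n and ".so" in n)
```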
I guess we can check with auditwheel:
$ auditwheel show python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
is consistent with the following platform tag:
"manylinux_2_17_x86_64".
The wheel references external versioned symbols in these
system-provided shared libraries: libc.so.6 with versions
{'GLIBC_2.3', 'GLIBC_2.7', 'GLIBC_2.14', 'GLIBC_2.2.5'},
libpthread.so.0 with versions {'GLIBC_2.3.2', 'GLIBC_2.2.5',
'GLIBC_2.3.4'}, libm.so.6 with versions {'GLIBC_2.2.5'}
This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.
In CI when building the wheels auditwheel says:
Repairing wheel...
+ sh -c 'auditwheel repair -w /tmp/cibuildwheel/repaired_wheel /tmp/cibuildwheel/built_wheel/python_flint-0.4.0-cp311-cp311-linux_x86_64.whl'
INFO:auditwheel.main_repair:Repairing python_flint-0.4.0-cp311-cp311-linux_x86_64.whl
INFO:auditwheel.wheeltools:Previous filename tags: linux_x86_64
INFO:auditwheel.wheeltools:New filename tags: manylinux_2_17_x86_64, manylinux2014_x86_64
INFO:auditwheel.wheeltools:Previous WHEEL info tags: cp311-cp311-linux_x86_64
INFO:auditwheel.wheeltools:New WHEEL info tags: cp311-cp311-manylinux_2_17_x86_64, cp311-cp311-manylinux2014_x86_64
INFO:auditwheel.main_repair:
Fixed-up wheel written to /tmp/cibuildwheel/repaired_wheel/python_flint-0.4.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
+ /opt/python/cp38-cp38/bin/python -c 'import sys, json, glob; json.dump(glob.glob('"'"'/tmp/cibuildwheel/repaired_wheel/*.whl'"'"'), sys.stdout)'
The configuration for this is here: https://github.com/fredrik-johansson/python-flint/blob/bde3dbdb37766c5d9d66184b886611f33d15578f/.github/workflows/buildwheel.yml#L34-L54 So we are asking for the manylinux2014 image but somehow getting something with manylinux_2_17.
I think that I have previously seen some issue with downloading a Linux wheel from CI and that possibly at the time I "fixed" it by using a different manylinux image: at least that is the first thing I can try.
It seems that NumPy's wheels are also double badged and auditwheel reports a similar result:
$ auditwheel show numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
numpy-1.25.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
is consistent with the following platform tag:
"manylinux_2_17_x86_64".
The wheel references external versioned symbols in these
system-provided shared libraries: libm.so.6 with versions
{'GLIBC_2.2.5'}, libgcc_s.so.1 with versions {'GCC_4.2.0',
'GCC_4.3.0', 'GCC_3.0', 'GCC_3.4', 'GCC_3.3', 'GCC_4.8.0'},
libquadmath-96973f99.so.0.0.0 with versions {'QUADMATH_1.0'},
libc.so.6 with versions {'GLIBC_2.14', 'GLIBC_2.4', 'GLIBC_2.17',
'GLIBC_2.3', 'GLIBC_2.3.4', 'GLIBC_2.6', 'GLIBC_2.3.2', 'GLIBC_2.2.5',
'GLIBC_2.10', 'GLIBC_2.7'}, libpthread.so.0 with versions
{'GLIBC_2.3.2', 'GLIBC_2.3.4', 'GLIBC_2.2.5'},
libgfortran-040039e1.so.5.0.0 with versions {'GFORTRAN_8'}
This constrains the platform tag to "manylinux_2_17_x86_64". In order
to achieve a more compatible tag, you would need to recompile a new
wheel from source on a system with earlier versions of these
libraries, such as a recent manylinux image.
Also the contents of the wheel seem similar at least for what is bundled:
└── numpy.libs
├── libgfortran-040039e1.so.5.0.0
├── libopenblas64_p-r0-5007b62f.3.23.dev.so
└── libquadmath-96973f99.so.0.0.0
I think that I have previously seen some issue with downloading a Linux wheel from CI and that possibly at the time I "fixed" it by using a different manylinux image: at least that is the first thing I can try.
I've tried manylinux2010 (#46) and manylinux_2_24 (#47) but in all cases I get the same crash.
The problem is specifically in string conversion e.g. converting an fmpz to a decimal string so my guess is that that codepath goes through to libm or libc somehow and then some ABI compatibility breaks. I don't know whether any of GMP, MPFR, Flint or Arb is using libm or libc in a non-standard way somehow that manylinux/auditwheel just can't account for when attempting to produce a relocatable wheel.
Converting to a hex string works for sufficiently small fmpz:
>>> import flint
>>> hex(flint.fmpz(2))
'0x2'
>>> hex(flint.fmpz(2<<2))
'0x8'
>>> hex(flint.fmpz(2<<61))
'0x0'
>>> hex(flint.fmpz(2<<60))
'0x2000000000000000'
>>> hex(flint.fmpz(2<<62))
Illegal instruction (core dumped)
Worth noting that the internal representation for fmpz changes at 2**62 from being a native C long (unsigned?) to being a pointer to a GMP mpz. So somehow at that point the hex conversion fails. Converting to a decimal string fails at much smaller sizes:
>>> import flint
>>> str(flint.fmpz(0))
'0'
>>> str(flint.fmpz(1))
Illegal instruction (core dumped)
Conversion is done using fmpz_get_str:
https://github.com/fredrik-johansson/python-flint/blob/bde3dbdb37766c5d9d66184b886611f33d15578f/src/flint/fmpz.pyx#L166-L184
For Flint 2.9.0 (which we use here) that call goes straight through to gmp:
https://github.com/flintlib/flint2/blob/e143df4b0f19d2f841e36234a12b69f48c4359b9/fmpz/get_str.c#L17
That code seems to be completely rewritten without GMP in Flint 3.0.0-alpha1:
https://github.com/flintlib/flint2/blob/0447e52c21d559b0b581ac6ed9f3e8b2380830bb/src/fmpz/get_str.c#L237
I have a branch that already uses flint-3.0.0-alpha1 in gh-43! Downloading those wheels on Linux also fails, though not in exactly the same way. It works up to 2**62-1 but crashes after that:
(Pdb) p str(flint.fmpz(2**62-1)) == str(2**62-1)
True
(Pdb) p str(flint.fmpz(2**62)) == str(2**62)
Illegal instruction (core dumped)
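That threshold matches FLINT's small-value optimisation: on 64-bit builds an fmpz with absolute value at most 2**62 - 1 (COEFF_MAX) is stored inline in the slong slot, and anything bigger becomes a pointer to a heap-allocated GMP mpz. A sketch of the boundary, in Python just to mirror the constants (this is not FLINT's actual code):

```python
FLINT_BITS = 64
COEFF_MAX = (1 << (FLINT_BITS - 2)) - 1   # 2**62 - 1 on 64-bit builds


def fmpz_is_inline(n):
    # True when FLINT can store n directly in the slong slot; False when
    # it falls back to a GMP mpz pointer, which is exactly where the
    # crashing codepath starts in the examples above.
    return -COEFF_MAX <= n <= COEFF_MAX
```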
Even if using flint-3.0.0-alpha1 did work that is really just working around the ABI issue. Somehow cibuildwheel/manylinux/auditwheel are not doing their job properly because the wheel is not relocatable.
I would appreciate if anyone could report testing these wheels which you can do like:
All tests passed on macOS 13 (Ventura) with Python 3.9, 3.10, 3.11 (downloaded from Python.org) on an iMacPro 2017. All tests passed on Windows 10 Pro with Python 3.9 and 3.11 (downloaded from Python.org) on a Surface Pro 2017 (i5-7300U). It might be worth adding the test files to the wheels inside a submodule so that one could do something like:
python -m pip install python-flint
python -m flint.test
to run all the tests without cloning the source repo.
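A minimal sketch of what such a python -m flint.test entry point could do, written as a generic runner (the names are illustrative; python-flint's actual test files would need adapting to this shape):

```python
def run_tests(module):
    # Call every function named test_* in the module, report each result,
    # and return the failure count so __main__ can sys.exit() on it.
    failures = 0
    for name in sorted(dir(module)):
        if name.startswith("test_") and callable(getattr(module, name)):
            try:
                getattr(module, name)()
            except Exception as exc:
                failures += 1
                print(f"FAIL {name}: {exc!r}")
            else:
                print(f"OK   {name}")
    return failures
```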
Thanks @NathanDunfield!
on an iMacPro 2017
Am I right in presuming that would be x86-64 (i.e. not the M1 CPU)?
It might be worth adding the test files to the wheels inside a submodule
Yes, that is a good idea. I've included them in the sdist but they are not in the wheels. The wheels are already quite large because of the bundled binaries so the test files would not noticeably increase the size.
Am I right in presuming that would be x86-64 (i.e. not the M1 CPU)?
Yes, it's an Intel processor (Xeon W-2150B), not Apple Silicon.
I've just pushed a 0.4.1 release to PyPI.
The issue with the Linux wheels remains unresolved: I have no more time to work on this and, in any case, it is unclear to me how to investigate further. Unless anyone has any bright ideas I just need to defer this problem.
Instead of fixing the Linux wheels issue I have pushed updated instructions for building from sdist on Linux: https://github.com/fredrik-johansson/python-flint#installation
On the bright side the Windows and OSX wheels seem to work very well. Thanks to @NathanDunfield for testing on Windows and OSX x86-64 for a range of Python versions. I have now also tested on OSX arm64 and that seems to work great as well.
Maybe this issue should be closed and replaced with a Linux-specific wheel issue.
FYI I'm working on uploading wheels to pypi over at https://github.com/xoviat/build-python-flint. After some pain, I've managed to build python-flint on windows and will probably have builds for OSX and manylinux as well.