Because PyBaMM comes with a C++ extension for Python, the corresponding wheel distribution contains compiled code.
We should make PyBaMM wheels compliant with the manylinux policy.
This issue is addressed by PR #897.
Distributing compiled binaries
When distributing compiled code, one has to be careful that the target system has all the libraries that the compiled executable (here the idaklu extension) links against.
This is especially a problem for Linux, since different Linux distributions do not provide the same libraries and versions through their package managers.
More importantly, all compiled C/C++ extensions have a runtime dependency on the GNU C Library glibc, whose version depends on the system the extension was compiled on.
For instance, if you compile your extension on Ubuntu 19.04, it will be linked against a fairly recent glibc version that is not available on, say, CentOS 6.
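As a quick way to see which glibc symbol versions a compiled extension actually requires, one can inspect its dynamic symbol table; the extension filename below is illustrative and will differ depending on the Python version and platform:

```bash
# List the glibc symbol versions referenced by the compiled idaklu extension
# (adjust the filename to your build output).
objdump -T idaklu.cpython-38-x86_64-linux-gnu.so | grep GLIBC_ | sort -u
```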
The manylinux policy
To be able to distribute portable wheels that work on many Linux distributions and versions, PEP 513 introduced the manylinux policy, i.e. a small set of libraries and corresponding maximum versions that can be expected to be available on many Linux systems. The key point is that all these libraries maintain backward compatibility, so that any code linked against an older glibc will be able to use a more recent one.
In practice, this means compiling/linking your C extension against old versions of these libraries.
To help with this, the manylinux policy comes with a docker image.
By compiling the Python extension inside this image, you make sure your C extension is linked against old enough versions of these libraries, and in particular glibc, for maximum portability across many Linux systems!
At the moment, there are three manylinux policies, each one allowing for a more recent set of library versions.
My suggestion is that we use the latest one, manylinux2014, see [PEP 599 - manylinux2014](https://www.python.org/dev/peps/pep-0599/), which should guarantee compatibility with Linux distributions as old as 2014.
In PR #897 I introduce a small script build-wheel.sh that builds the wheel inside the manylinux docker image:
```bash
# in PyBaMM/
docker run -v `pwd`:/io quay.io/pypa/manylinux2014_x86_64 /io/build-wheel.sh
```
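For orientation, such a build script typically follows the standard manylinux workflow; the sketch below is an assumption and does not necessarily match the actual content of build-wheel.sh (the interpreter path, output directories and platform tag are illustrative):

```bash
#!/bin/bash
# Simplified sketch of a manylinux build script (not the actual build-wheel.sh).
set -e -x

# Build the wheel with one of the Python interpreters shipped in the image.
/opt/python/cp38-cp38/bin/pip wheel /io/ --no-deps -w /io/dist/

# Repair the wheel so that external shared libraries are bundled into it.
for whl in /io/dist/*.whl; do
    auditwheel repair "$whl" --plat manylinux2014_x86_64 -w /io/wheelhouse/
done
```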
What about SuiteSparse and Sundials?
These two libraries are of course not in the set of libraries available in the manylinux docker image!
The manylinux policy comes with a nice tool called [auditwheel](https://github.com/pypa/auditwheel) that can be used to check the compliance of a wheel with a given manylinux policy.
More importantly, auditwheel can repair a wheel, that is, detect all external shared libraries and bundle them into the wheel!
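As a minimal sketch of the compliance check (the wheel filename is illustrative and will differ on your machine):

```bash
# Report which manylinux policy the wheel satisfies and list the external
# shared libraries the compiled extension links against.
auditwheel show dist/pybamm-*-linux_x86_64.whl
```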
Example
Say we have the following directory structure:
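Something along these lines, purely as an illustration (the exact repository layout and the location of the SUNDIALS/SuiteSparse installs are assumptions):

```
PyBaMM/
├── build-wheel.sh
├── setup.py
├── pybamm/            # Python package, including the idaklu C++ extension sources
└── third-party/       # local SUNDIALS and SuiteSparse installations (assumed)
    ├── sundials/
    └── SuiteSparse/
```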
Let's build the wheel and have a look at the runtime dependencies of the idaklu module:
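A sketch of the commands involved (the exact name and location of the compiled extension depend on the Python version and platform):

```bash
# Build the wheel, then list the runtime dependencies of the compiled extension.
python setup.py bdist_wheel
find . -name "idaklu*.so" -exec ldd {} \;
```

The ldd output shows the SUNDIALS and SuiteSparse shared libraries resolved to their absolute paths on the build machine.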
Which means that, on another system, these dependencies will only be satisfied if the same versions of the libraries are installed in the same locations.
Using auditwheel, it is possible to bundle these external dependencies into the wheel:
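A sketch of the repair step (the wheel filename and output directory are illustrative):

```bash
# auditwheel copies the external shared libraries (SUNDIALS, SuiteSparse, ...)
# into the wheel and patches the extension's RPATH so they are found at runtime.
auditwheel repair dist/pybamm-*-linux_x86_64.whl -w wheelhouse/
```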
Looking inside the repaired wheel, we see that it contains a directory .libidaklu with the runtime dependencies:
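Since a wheel is just a zip archive, its contents can be listed directly; the wheel filename below is illustrative:

```bash
# List the shared libraries bundled into the repaired wheel.
unzip -l wheelhouse/pybamm-*-manylinux2014_x86_64.whl | grep '\.so'
```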
This means that a user installing the wheel, for instance via pip, does not need to have SuiteSparse or Sundials installed!