Open stuaxo opened 4 years ago
As a counterargument... I'm not quite sure whether parts of the API like the Xlib surface can work from this sort of WHL, though I can't find any examples to test this.
Also not quite sure how manylinux works, but one potential problem is that the API surface of cairo depends on build time config and the version, so a pycairo built against one version might not load for another version.
We could work around that by looking up every symbol we don't strictly depend on dynamically: build against a new/full cairo and raise errors from each method/constructor when linking against an older version. That's not something I have much experience with, but it's been on my mind for some time now.
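The dynamic-lookup idea above would live in pycairo's C extension (dlopen/dlsym), but a minimal sketch of the same pattern can be shown with Python's ctypes. This uses libm purely as a stand-in for cairo, and `no_such_fn` stands in for a symbol only present in newer cairo releases:

```python
import ctypes
import ctypes.util

# Stand-in for libcairo: look symbols up lazily instead of linking them.
libm = ctypes.CDLL(ctypes.util.find_library("m"))

def lookup(lib, name):
    """Return the function if the loaded library exports it, else None."""
    try:
        return getattr(lib, name)
    except AttributeError:
        return None

cos = lookup(libm, "cos")            # present in every libm
newer = lookup(libm, "no_such_fn")   # stands in for a newer-cairo-only symbol

if cos is not None:
    cos.restype = ctypes.c_double
    cos.argtypes = [ctypes.c_double]
```

A wrapper method would then check its resolved pointer and raise a descriptive error when the running library is too old, instead of failing at link time.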
I did a bit more reading [1] and it's quite interesting: they chose an older version of CentOS with a GLIBC old enough to be compatible across different distros (Debian and derivatives, Red Hat, SUSE).
There is a command that audits the WHL to make sure it is compatible and puts the needed libraries into it - when I'm back in front of my own computer I will list those.
I need to do more testing, maybe on a headless docker image without Xlib, to see what breaks - I'm assuming that it will fail because Xlib is not present.
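As a quick sanity check of the GLIBC baseline mentioned above (manylinux1 builds on CentOS 5, whose glibc is 2.5, so essentially any newer distro satisfies it), Python can report the C library it was linked against:

```python
import platform

# Returns e.g. ('glibc', '2.31') on a recent distro; any version >= 2.5
# satisfies the manylinux1 baseline.  (Returns ('', '') on non-glibc systems.)
libc, version = platform.libc_ver()
print(libc, version)
```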
We could work around that by doing every symbol lookup of things which we don't strictly depend on dynamically,
How about splitting the pycairo build by backend?
There would be a separate .so for each backend, that way if dependencies for any one backend are not present it wouldn't prevent you from loading another.
e.g. if you tried to load XlibSurface and Xlib is not available it would fail, but you could still have ImageSurface and friends.
[1] https://opensource.com/article/19/2/manylinux-python-wheels
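The per-backend split suggested above might look roughly like this from the Python side. The module names (`_cairo_xlib` etc.) are hypothetical, purely to illustrate the import-time fallback:

```python
# Hypothetical layout: one extension module per backend, e.g. _cairo_image.so
# and _cairo_xlib.so.  Importing the package only hard-requires the image
# backend; a backend whose dependencies are missing degrades to a stub class.

def _load_backend(modname, classname):
    try:
        module = __import__(modname)
        return getattr(module, classname)
    except ImportError as exc:
        reason = str(exc)

        class _Unavailable:
            def __init__(self, *args, **kwargs):
                raise RuntimeError(
                    f"{classname} backend not available: {reason}")

        return _Unavailable

# _cairo_xlib doesn't exist here, so XlibSurface becomes the stub; importing
# the package itself still succeeds.
XlibSurface = _load_backend("_cairo_xlib", "XlibSurface")
```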
I shared a dropbox folder with WHLs I've generated. This doesn't seem as useful as I thought, since it doesn't include cairo, or pixman.
Importing fails in my docker environment because I don't have X, so I'm going to see if I can build with a more minimal cairo.
Is there a way, when building pycairo, to pass something like --no-x to cairo when it's built?
There would be a separate .so for each backend, that way if dependencies for any one backend are not present it wouldn't prevent you from loading another.
That would work, I guess, for the different backends, but not for different cairo versions, which have different sets of symbols. Unless we build against an old cairo with some features disabled, but then I'm afraid of bug reports re missing features.
The only proper solution I see, sadly, is that we get rid of every top-level ifdef and do all symbol lookups manually.
Good point, doing something like that at this end could get messy quickly.
Cairo itself could be split up into plugins, along the lines of:
libcairo2 -> libcairo2_pdf, libcairo2_png, etc.
I need to get back to poking at cairo to see how feasible it is.
It's impossible to avoid any API breakage, but I wonder if, when a backend wasn't installed, it could resolve to a stub instead, so all the symbols would always be present.
In effect it would be as if cairo was compiled with everything enabled, but all the parts that actually do things would be split up so they could be packaged separately.
This is a little off topic for python bindings though.
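The stub idea above — every symbol always present, with missing backends reporting failure through cairo's status-code model — could be sketched like this. The status value and function names here are illustrative, not real cairo API guarantees:

```python
# Sketch: resolve every backend entry point at startup; entries that weren't
# compiled in get a stub that is callable but only reports an error status.
CAIRO_STATUS_SUCCESS = 0
CAIRO_STATUS_BACKEND_UNAVAILABLE = 99   # hypothetical status code

def make_stub(name):
    def stub(*args, **kwargs):
        # The symbol exists, so linking/lookup never fails; calling it does.
        return CAIRO_STATUS_BACKEND_UNAVAILABLE
    stub.__name__ = name
    return stub

def resolve(table, name):
    return table.get(name) or make_stub(name)

# Pretend only the PNG backend was compiled in:
compiled_in = {
    "cairo_surface_write_to_png": lambda surface, path: CAIRO_STATUS_SUCCESS,
}

write_png = resolve(compiled_in, "cairo_surface_write_to_png")
pdf_create = resolve(compiled_in, "cairo_pdf_surface_create")
```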
You typo'd my name, but fortunately I was looking at the tracker for other reasons and noticed this :)
This doesn't seem as useful as I thought, since it doesn't include cairo, or pixman.
Actually you need to run auditwheel repair. What this does is detect the dependency on libcairo.so (and whatnot), copy it into a .libs directory in the wheel, and adjust the RPATH of the pycairo shared object to point to it (RPATH is "basically" like LD_LIBRARY_PATH but just for a single shared object). So this should always use the version (in fact, the copy) of libcairo that the wheel was built against.
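To make the layout concrete, here is a mock-up of what a repaired wheel looks like inside (a wheel is just a zip). The hashed library name is illustrative of auditwheel's renaming scheme, not an exact real filename:

```python
import os
import tempfile
import zipfile

# Fake the layout auditwheel repair produces: the grafted library lives in a
# package-local .libs directory, and the extension's RPATH points at it.
tmp = tempfile.mkdtemp()
whl = os.path.join(tmp, "pycairo-1.18.2-cp37-cp37m-manylinux1_x86_64.whl")
with zipfile.ZipFile(whl, "w") as zf:
    zf.writestr("cairo/_cairo.cpython-37m-x86_64-linux-gnu.so", b"")
    zf.writestr("cairo/.libs/libcairo-3b63c3f9.so.2.11600.0", b"")

with zipfile.ZipFile(whl) as zf:
    grafted = [n for n in zf.namelist() if "/.libs/" in n]
print(grafted)
```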
Thanks, mistyping things is my main superpower 😬
Does that mean that pycairo WHLs are perfectly valid? The more I learn about shared objects, the more of a dark art it seems.
I think they should be (that's the intent of manylinux).
What happens if pygobject, which uses pycairo, also links against cairo?
It depends what python extension gets loaded first. If it's pycairo (including if pygobject imports pycairo before importing its own extension module) then the RPATH cairo is loaded and then also used by pygobject. (Basically only one libcairo.so will ever be loaded into the process, the first one wins -- but that was already the case before.) PS: This is all from my memory, no guarantees :)
Ah, OK, thanks. In the case of pygobject, some of the libraries it loads, like gtk/gdk/pango/gstreamer, link against cairo, which is why I'm asking. So depending on what kind of cairo gtk expects, this might explode or not :)
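The "first one wins" behaviour described above can be demonstrated with ctypes: asking the dynamic loader for a library it has already loaded hands back the existing copy rather than loading a second one (shown here with libm as a stand-in):

```python
import ctypes
import ctypes.util

# dlopen() returns the handle of the already-loaded copy when asked for the
# same library again, so the first load "wins" for the whole process.
name = ctypes.util.find_library("m")
first = ctypes.CDLL(name)
second = ctypes.CDLL(name)
```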
Can we intervene in this process (perhaps by messing with RPATH)?
Ideally we would change the order, so that if external libcairo.so is available, use that - otherwise use the one in the WHL.
I don't know.
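For what it's worth, the preference order asked about above (system libcairo first, bundled copy as fallback) is easy to express at load time if the lookup is done dynamically rather than via RPATH. A sketch, with libm standing in for the wheel's bundled copy:

```python
import ctypes
import ctypes.util

def load_first(candidates):
    """Return the first library dlopen() accepts, or None if all fail."""
    for name in candidates:
        if name is None:
            continue
        try:
            return ctypes.CDLL(name)
        except OSError:
            pass  # not present on this system; try the next candidate
    return None

# Hypothetical preference order: external system cairo, then the bundled copy.
lib = load_first([
    "libcairo.so.2",                 # system copy, if any
    ctypes.util.find_library("m"),   # stand-in for the wheel's bundled copy
])
```

Whether this can be retrofitted onto an already-built wheel's RPATH is a separate question.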
Are you up for adding your script to a pull request in .azure-pipelines?
I'd like to have a play with the output, but work is so locked down I can't even access gist.github.com.
Nah, I'll let you handle the unfun work of actually making things work :) Your version is copied here :)
#!/bin/bash
# Written by Antony Lee (@anntzer).
set -e

PYTHON_VERSION=37
PIP="/opt/python/cp$PYTHON_VERSION-cp${PYTHON_VERSION}m/bin/pip"

if ! [[ -e "$PIP" ]]; then
    # Not in the manylinux image yet: call self from within docker.
    docker run -it \
        --mount type=bind,source="$(readlink -f "$(dirname "$0")")",target=/io \
        quay.io/pypa/manylinux1_x86_64 \
        "/io/$(basename "$0")"
    exit
fi

ZLIB_VERSION=1.2.11
LIBPNG_VERSION=1.6.37
FREETYPE_VERSION=2.10.0
EXPAT_VERSION=2.2.9
# FONTCONFIG_VERSION=2.13.1  # Requires one of: libxml2, expat (technically a dependency of Python too); gperf.
PIXMAN_VERSION=0.38.4
CAIRO_VERSION=1.16.0
PYCAIRO_VERSION=1.18.2

cd /io  # the bind-mounted source directory
mkdir -p deps wheelhouse
yum install -y xz

download_unpack_cd () {
    local url filename
    url="$1"
    filename="$(basename "$url")"
    if ! [[ -e "$filename" ]]; then
        curl -LOJ "$url"
    fi
    # Glob matches (quoting the RHS of =~ would make it a literal string, so
    # the escaped patterns would never match).
    if [[ "$filename" == *.tar.gz ]]; then
        tar -xzf "$filename"
        dirname="${filename%.tar.gz}"
    elif [[ "$filename" == *.tar.bz2 ]]; then
        tar -xjf "$filename"
        dirname="${filename%.tar.bz2}"
    elif [[ "$filename" == *.tar.xz ]]; then
        xzcat "$filename" | tar -x
        dirname="${filename%.tar.xz}"
    else
        echo 'Unknown extension' >&2
        exit 1
    fi
    cd "$dirname"
}

install_autoconf () {
    local url flags dirname
    url="$1"
    flags="$2"
    (
        cd deps
        download_unpack_cd "$url"
        ./configure $flags  # unquoted on purpose: $flags may hold several options
        make
        make install
    )
}

build_wheel () {
    local url dest
    url="$1"
    dest="$2"
    (
        cd deps
        download_unpack_cd "$url"
        "$PIP" wheel --no-deps --wheel-dir "$dest" .
    )
}

install_autoconf "https://zlib.net/zlib-$ZLIB_VERSION.tar.gz"
install_autoconf "https://download.sourceforge.net/libpng/libpng-$LIBPNG_VERSION.tar.gz"
install_autoconf "https://download.savannah.gnu.org/releases/freetype/freetype-$FREETYPE_VERSION.tar.gz"
install_autoconf "https://downloads.sourceforge.net/project/expat/expat/$EXPAT_VERSION/expat-$EXPAT_VERSION.tar.bz2"
# install_autoconf "https://www.freedesktop.org/software/fontconfig/release/fontconfig-$FONTCONFIG_VERSION.tar.gz"
install_autoconf "https://www.cairographics.org/releases/pixman-$PIXMAN_VERSION.tar.gz"
install_autoconf "https://www.cairographics.org/releases/cairo-$CAIRO_VERSION.tar.xz" --disable-gobject
build_wheel "https://github.com/pygobject/pycairo/releases/download/v$PYCAIRO_VERSION/pycairo-$PYCAIRO_VERSION.tar.gz" \
    "$(readlink -f wheelhouse)"
auditwheel repair "wheelhouse/pycairo-$PYCAIRO_VERSION-cp$PYTHON_VERSION-cp${PYTHON_VERSION}m-linux_x86_64.whl"
Sounds good to me :) Thanks for pasting it!
@anntzer has a script to build manylinux WHLs for Python: https://gist.github.com/anntzer/a03230f94e6d111ba3abc737d5091b99
I bumped the versions of everything and successfully built the wheels.
All the current tests seem to pass with the WHL on Ubuntu.
To do this, I used this slightly hacky procedure:
I wonder if it's worth extending this into something that could be used to build WHLs for release, or is it something we should do with the CI pipelines?
The manylinux project is here - I've not spent enough time looking at it yet to fully grasp the ins and outs: https://github.com/pypa/manylinux