Closed: letmaik closed this 3 years ago
No, sorry, delocate doesn't do that at the moment.
If you run delocate-wheel or delocate-path and then check the libraries copied into .dylibs (the default location), you will see the dependencies there, but it would be good if delocate-listdeps showed you, I agree.
So, are you saying that delocate-wheel correctly pulls in recursive dependencies and it's just delocate-listdeps which doesn't display them? I haven't tested it on a Mac myself yet, just through Travis.
Yes, exactly, delocate will pull in all the dependencies-of-dependencies except the ones in the system dirs /System and /usr/lib (by default).
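For illustration, the default exclusion of /System and /usr/lib mentioned above can be thought of as a simple path-prefix filter. This is only a sketch of the described behavior, not delocate's actual code; the `should_vendor` function and the example paths are made up for the demo:

```shell
#!/bin/bash
# Sketch: decide whether a dependency would be vendored into .dylibs.
# Mimics the default filter described above: system locations are skipped,
# everything else is copied (recursively, dependencies-of-dependencies too).
should_vendor() {
    case "$1" in
        /usr/lib/*|/System/*) return 1 ;;  # system library: leave as-is
        *) return 0 ;;                     # anything else: copy into .dylibs
    esac
}

for dep in /usr/lib/libSystem.B.dylib \
           /System/Library/Frameworks/CoreFoundation.framework/CoreFoundation \
           /usr/local/lib/libicui18n.64.2.dylib; do
    if should_vendor "$dep"; then
        echo "vendor: $dep"
    else
        echo "skip:   $dep"
    fi
done
```

Note that the real tool also lets you override the exclusion list; this sketch only shows the default split.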
Sorry - I should also say that delocate-listdeps --all will do what you want.
EDIT - scratch that - I was misreading my own docs.
I have a closely related strange behavior in delocate. My Python package includes a (pybind11-glued) library that depends on libboost-graph, which depends on libicu (including libicui18n.64.2.dylib). libboost-graph and libicui18n.64.2.dylib are correctly added to the .dylibs folder, but libicui18n.64.2.dylib depends on libicudata.64.dylib and libicuuc.64.dylib, and those are not included in the .dylibs folder. Unsurprisingly, importing the Python package leads to an import error indicating that @loader_path/libicuuc.64.dylib, referenced from libicui18n.64.2.dylib, cannot be loaded. Is there a limit to the recursion of delocate?
I don't know of a limit. It sounds like we need to investigate ...
Actually, after reading delocating and libsana, I think the issue comes from the fact that delocate does not handle @loader_path dependencies. In my case, libicui18n.64.2.dylib depends on @loader_path/libicudata.64.dylib and @loader_path/libicuuc.64.dylib. These are not copied (or renamed). Indeed, their realnames are actually libicudata.64.1.dylib and libicuuc.64.1.dylib, which have been copied (because of other dependencies).
I would say that when a library with @loader_path dependencies is copied, and those dependencies' realnames have already been copied, then the install names they are referred by should be changed to the new local .dylibs names. Otherwise, the dependency itself should be copied too.
Actually, it looks like HexDecimal's PR above could solve my issue.
Well, I just tried and... no, it doesn't. The copied libicui18n.64.2.dylib still depends on @loader_path/libicudata.64.dylib and @loader_path/libicuuc.64.dylib, which are copied under their realnames, not these @loader_path names that refer to symbolic links.
It's not a depth thing. I have a library which is a dependency of the main C++ library and isn't being copied. That said, I am using @loader_path as well, but without better tracing logic (say, a -v flag), I'm left debugging what is going on and why it gets missed.
Thanks Ben. In the meantime, I have spent some time on this and have been able to get a working PyPI package that uses Boost and its dependencies. My issue is entirely caused by the fact that Boost depends on a library (libicu) that has several sublibraries that depend on each other using @loader_path (libicui18n, libicudata, libicuuc, ...). The libraries' realnames include the full version number (64.2.1), while the @loader_path dependencies reference a more general version (64.2), which is reached through a symlink to the real library.
So, sub-library libicui18n is copied, and it depends on @loader_path/libicudata.64.2. libicudata is copied under its realname (libicudata.64.2.1) because of another dependency (lucky me), but it is not found on loading because of the version mismatch.
So, the rule that delocate needs to enforce is: "when a library A with @loader_path dependencies is copied, those @loader_path dependencies must be copied under their realnames, and library A's @loader_path references must be updated to those realnames."
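That rule can be sketched as a dry run in shell: resolve the symlink behind the @loader_path name to the realname on disk, then print the install_name_tool invocation that would rewrite the reference. This is only an illustration of the idea, not delocate code; the temporary directory simulates the ICU layout described above, and the filenames are the ones from this thread:

```shell
#!/bin/bash
# Simulate a .dylibs folder containing the ICU realname plus the versioned
# symlink that the @loader_path reference actually names.
dylibs=$(mktemp -d)
touch "$dylibs/libicudata.64.2.1.dylib"                        # realname, as copied
ln -s libicudata.64.2.1.dylib "$dylibs/libicudata.64.2.dylib"  # versioned symlink

dep="libicudata.64.2.dylib"          # the name after @loader_path/
real=$(readlink "$dylibs/$dep")      # resolve one level of symlink -> realname
# (for symlink chains you would want a full resolution, e.g. GNU readlink -f)

# Print the rewrite that the rule calls for, instead of running it:
echo install_name_tool -change "@loader_path/$dep" "@loader_path/$real" libicui18n.64.2.dylib
rm -rf "$dylibs"
```

Running install_name_tool for real would of course only work on macOS, and the target library must be writable first (chmod u+w), as in the script further down.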
I now run the following bash script to fix the delocated wheel. Under Travis, all my delocated wheels end up in a "wheelhouse" folder, and the script should get that folder name as an argument. This could be useful (after suitable changes) to others maybe, and I'm confident someone can come up with a more general version that scans all @loader_path dependencies and updates the install names with realnames as needed :-)
#!/bin/bash
# Fix up ICU @loader_path references in delocated wheels.
# Usage: fix-wheels.sh <wheelhouse-folder>
pkgname=pypackage
cd "$1"
for wheel in *.whl
do
    mkdir "$wheel.tmp"
    cd "$wheel.tmp"
    unzip "../$wheel"
    cd "$pkgname/.dylibs"
    # Realnames of the copied ICU libraries (full version, e.g. 64.2.1).
    libicuuc=$(ls libicuuc*.dylib)
    libicui18n=$(ls libicui18n*.dylib)
    libicudata=$(ls libicudata*.dylib)
    # Derive the shorter versioned names used by the @loader_path references:
    # strip ".dylib", strip the last version component, re-add ".dylib".
    slibicuuc=${libicuuc%.*}
    slibicuuc=${slibicuuc%.*}.dylib
    slibicui18n=${libicui18n%.*}
    slibicui18n=${slibicui18n%.*}.dylib
    slibicudata=${libicudata%.*}
    slibicudata=${slibicudata%.*}.dylib
    chmod u+w "$libicuuc" "$libicui18n"
    # Point the @loader_path references at the realnames actually present.
    install_name_tool -change "@loader_path/$slibicudata" "@loader_path/$libicudata" "$libicuuc"
    install_name_tool -change "@loader_path/$slibicudata" "@loader_path/$libicudata" "$libicui18n"
    install_name_tool -change "@loader_path/$slibicuuc" "@loader_path/$libicuuc" "$libicui18n"
    cd ../..
    rm "../$wheel"
    zip -r "../$wheel" *
    cd ..
    rm -rf "$wheel.tmp"
done
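For clarity, the `${var%.*}` expansions above derive the shorter, symlink-style name (the one the @loader_path references use) from the realname found in .dylibs. In isolation, using the version numbers from this thread:

```shell
#!/bin/bash
# Derive the @loader_path name from a realname:
# strip the ".dylib" suffix, strip the last version component, re-add ".dylib".
libicuuc=libicuuc.64.2.1.dylib    # realname as copied into .dylibs
slibicuuc=${libicuuc%.*}          # -> libicuuc.64.2.1
slibicuuc=${slibicuuc%.*}.dylib   # -> libicuuc.64.2.dylib
echo "$slibicuuc"
```

`install_name_tool -change old new lib` then maps that derived name back to the realname, which is what actually exists in the folder.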
My solution has been to skip delocate altogether and just build the libraries that need to go into the wheel directly into the .dylibs directory. I'm curious whether the project I'm working on is just more complicated than anyone has ever tried (in this case, it's just a bunch of libraries that need to be packaged) or whether people have been manually fixing things up for the past $many years and living with it.
<semi-rant>
Python deployment for projects that aren't pure Python seems really sad even today, given the patchwork of solutions with neither endorsement nor official documentation of what the target layout is. Is it just "what delocate does"? Or is there some documentation for how this stuff is supposed to work?
I should clarify that my frustration is more with the Python packaging ecosystem than with delocate itself. delocate is trying its best, but when you've got people wondering "where are our macOS wheels?" on one side and "no one seems to have figured out how to ship compiled libraries in wheels[1] reliably" on the other, the lack of "official" tooling or documentation is quite frustrating.
[1] Including, but not limited to:
I can only agree with you. This is my first wheel ever, and it has been a pain to set up. Linux was already a pain because documentation on building wheels for binary Python packages and modules is lacking; I could only assemble pieces from various existing packages' GitHub repos (scikit-network has been very useful). Compared to macOS, the Linux variant was far easier to build thanks to the manylinux Docker images and the auditwheel package. But macOS wheeling is in dire need of more tools. delocate is a great step in the right direction, but there are still additional steps to take, I think.
My superficial understanding of delocate's code is that there will probably be no issue with "duplicate libraries" except for the duplicated space. delocate uses @loader_path with the realname of the library that is copied, so the copy in the .dylibs directory will be used and no other.
Linking against the libraries of other packages would be great but harder to manage. We are in a world of abundant RAM/disk space, and duplication is easier to manage (I hate to write this, but it's how it is).
That may work, but when you have global registries, there needs to be a single library that everyone agrees on using so that they can interoperate (basically, singletons generally exist per-library, so libsingleton being copied into each wheel means there are actually N copies of the "singleton"; which one gets seen depends on which wheel's library "asks" for it). For example, take Boost.Python's type registry: if it is loaded per-wheel, wheel A's inner copy won't "see" wheel B's types. If B depends on A, this is probably not going to allow things to interoperate well.
I'm writing a Python extension for a native library. delocate-listdeps lists the native library, but it doesn't list the dependencies of the native library, which would also have to be included in the wheel. How should I do that? Is that supported already?