Closed — joseph-henry closed this issue 3 years ago
Below is a build output showing the last wheel failing. After inspecting the process and the resultant .so files from previous successful builds, I suspect the "successful" builds in this case are actually devoid of any of the symbols from my C/C++ object files, hence their success (can't have a conflict if there are no symbols, right?). I think this is because the wrong .so file is being copied into the final wheel. I can produce .so files with the correct symbols, but then they fail the repair process.
In short, I'm misusing this tool, but I'm not sure how.
For instance:
00001f08 d _DYNAMIC
00002000 d _GLOBAL_OFFSET_TABLE_
w _ITM_deregisterTMCloneTable
w _ITM_registerTMCloneTable
00000478 r __FRAME_END__
00002014 d __TMC_END__
00002014 B __bss_start
w __cxa_finalize@@GLIBC_2.1.3
000003f0 t __do_global_dtors_aux
00001f00 d __do_global_dtors_aux_fini_array_entry
00001f04 d __dso_handle
00001efc d __frame_dummy_init_array_entry
w __gmon_start__
00000451 t __x86.get_pc_thunk.bx
00000445 t __x86.get_pc_thunk.dx
00002014 D _edata
00002018 B _end
00000460 T _fini
00000308 T _init
00002014 b completed.6760
00000360 t deregister_tm_clones
00000440 t frame_dummy
000003a0 t register_tm_clones
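As an aside, everything in the dump above is just the compiler's startup/shutdown plumbing — there isn't a single symbol from the project's own objects. A quick way to script that check rather than eyeballing `nm` output is to filter for globally defined text (`T`) symbols. A minimal sketch (the helper and its name are mine, not part of any tool mentioned here):

```python
def exported_functions(nm_output: str) -> set[str]:
    """Given the text output of `nm some.so`, return the globally
    defined function symbols (type 'T'), skipping local/weak entries."""
    found = set()
    for line in nm_output.splitlines():
        parts = line.split()
        # Typical line: "00000460 T _fini"; weak symbols may lack an address.
        if len(parts) == 3 and parts[1] == "T":
            found.add(parts[2])
    return found

sample = """\
00000460 T _fini
00000308 T _init
00000360 t deregister_tm_clones
00002014 B __bss_start
"""
print(sorted(exported_functions(sample)))  # ['_fini', '_init']
```

If the only `T` symbols are `_init` and `_fini`, as in the dump above, none of your C/C++ object code made it into the shared object.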
Building cp35-manylinux_x86_64 wheel
CPython 3.5 manylinux x86_64
Setting up build environment...
✓ 0.12s
Building wheel...
✓ 2.55s
Repairing wheel...
✓ 0.78s
✓ cp35-manylinux_x86_64 finished in 3.57s
Building cp36-manylinux_x86_64 wheel
CPython 3.6 manylinux x86_64
Setting up build environment...
✓ 0.04s
Building wheel...
✓ 2.54s
Repairing wheel...
✓ 0.30s
✓ cp36-manylinux_x86_64 finished in 3.01s
Building cp37-manylinux_x86_64 wheel
CPython 3.7 manylinux x86_64
Setting up build environment...
✓ 0.04s
Building wheel...
✓ 2.16s
Repairing wheel...
✓ 0.31s
✓ cp37-manylinux_x86_64 finished in 2.63s
Building cp38-manylinux_x86_64 wheel
CPython 3.8 manylinux x86_64
Setting up build environment...
✓ 0.04s
Building wheel...
✓ 2.47s
Repairing wheel...
✓ 0.31s
✓ cp38-manylinux_x86_64 finished in 2.94s
Building cp39-manylinux_x86_64 wheel
CPython 3.9 manylinux x86_64
Setting up build environment...
✓ 0.04s
Building wheel...
✓ 3.70s
Repairing wheel...
✕ 2.50s
Error: Process completed with exit code 1.
Could you check whether you have any .so files inside the repository? Maybe, by mistake, you pushed a Python 3.8-specific one? What is your Python development version?
Thanks for the reply.
Could you check whether you have any .so files inside the repository? Maybe, by mistake, you pushed a Python 3.8-specific one?
I suspected this and checked, nope.
What is your Python development version?
Do you mean the version installed during setup?
Run actions/setup-python@v2
Successfully setup CPython (3.9.2)
Just re-ran it: auditwheel: error: cannot repair "/tmp/cibuildwheel/built_wheel/libtest-1.3.4a1-cp39-cp39-linux_x86_64.whl" to "manylinux2014_x86_64"
Seems like it should work?
No, I mean the version installed on your local development machine. Is your repository public?
I think the problem comes from this line: python setup.py build_clib --verbose build_ext -i --verbose
I see. My development machines that have successfully built this module and wheel are on Python 3.8 and 3.9.
By removing python setup.py build_clib --verbose build_ext -i --verbose the wheels do build. However, they contain empty .so files, since the native C/C++ portion of the module isn't built.
How should I properly trigger the building and linking of the native portion of my module before the wheel is built?
That must happen inside the cibuildwheel step. (Also, this really should be triggered by the normal commands; pip install on the SDist should work, and you shouldn't have to call unique custom commands on setup.py for things to work — but that's for later.) For now, try removing that line and adding CIBW_BEFORE_BUILD: "python setup.py build_clib" (or the full line, but build_ext should get called by pip during the build).
Edit: that's an environment variable, if not clear from the above, so it goes under env: on the step.
- name: Build wheels
  working-directory: ${{github.workspace}}/pkg/pypi
  env:
    CIBW_BEFORE_BUILD: python setup.py build_clib
  run: |
    ln -s ../../ native
    cp -f native/src/bindings/python/*.py libtest/
    python -m cibuildwheel --output-dir wheelhouse
So these are great clues and I'm getting closer to a solution. However, I'm still missing some nuance here.
My package directory is called pkg/pypi. This is where my module, setup.py, etc. live. Crucially, there are no C/C++ sources there yet. So I must:
1. Symlink the repository root into pkg/pypi (referenced by setup.py via native/). The reason for this is that much of the sources are generated and maintained elsewhere.
2. Run python setup.py build_clib
3. Run python -m cibuildwheel --output-dir wheelhouse, which presumably calls setup.py build_ext at some point.
The problem is that the C/C++ sources are not found during steps 2 and 3, and thus an empty .so is generated.
The following makes sense according to my current understanding, but it does not seem to work:
name: Build
on: [push]
jobs:
  build_wheels:
    name: Build wheels on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        include:
          - os: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: recursive
      - uses: actions/setup-python@v2
      - name: Install cibuildwheel
        run: python -m pip install cibuildwheel==1.10.0
      - name: Build wheels
        env:
          CIBW_BEFORE_BUILD: "python setup.py build_clib"
        run: |
          ln -s $(pwd) pkg/pypi/native
          cd pkg/pypi
          cp -f native/src/bindings/python/*.py libtest/
          python -m cibuildwheel --output-dir wheelhouse
      - uses: actions/upload-artifact@v2
        with:
          path: ./pkg/pypi/wheelhouse/*.whl
I realize some of these questions may lie outside the scope of the project, but I appreciate the help anyway.
P.S. When I issue python setup.py build_clib --verbose build_ext -i --verbose manually before the cibuildwheel step, my sources are correctly found and symbols make their way into the final .so, but this then leads to symbol conflicts.
So my theory is that the cibuildwheel step somehow doesn't map my source tree into the containers in the way that I'm expecting.
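One way to test a theory like this (a sketch; the CIBW_BEFORE_BUILD hook is real, but the script and its name are mine) is to have the before-build step print exactly which files the container can see, e.g. CIBW_BEFORE_BUILD: "python debug_listing.py" with:

```python
# debug_listing.py -- hypothetical helper: dump every file path visible
# to the build environment, so you can check whether the symlinked
# sources actually made it into the container.
import os

def list_tree(top: str) -> list[str]:
    """Return all file paths under `top`, relative to it, skipping noise."""
    paths = []
    for root, dirs, files in os.walk(top):
        # Prune VCS/cache directories in place so os.walk skips them.
        dirs[:] = [d for d in dirs if d not in {".git", "__pycache__"}]
        for name in files:
            paths.append(os.path.relpath(os.path.join(root, name), top))
    return sorted(paths)

if __name__ == "__main__":
    for path in list_tree("."):
        print(path)
```

If the native sources are missing from that listing, the build is compiling against nothing, which would explain the empty .so.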
But why do you need a separate step to build the C/C++ libraries? Why can't they be built with pip wheel (python setup.py bdist_wheel)?
But why do you need a separate step to build the C/C++ libraries?
Because if the C and C++ sources are added to the same library in setup.py, the flags from the C++ compilation steps are erroneously applied to the C sources. Building in a separate step as a clib appears to be the only workaround for that behavior.
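For context, this is roughly what that workaround looks like in setup.py. This is a hedged sketch with made-up names (nativecore, the source paths) — the real project's layout will differ — and the cflags key in the library dict requires setuptools' build_clib, not the bare distutils one:

```python
from setuptools import setup, Extension

setup(
    name="libtest",
    # Compiled first by `build_clib`, with C-only flags. Keeping these
    # sources out of the Extension avoids C++ flags leaking onto C files.
    libraries=[
        ("nativecore", {
            "sources": ["native/src/core.c"],  # hypothetical path
            "cflags": ["-std=c99"],
        }),
    ],
    ext_modules=[
        Extension(
            "libtest._ext",
            sources=["native/src/bindings/python/ext.cpp"],  # hypothetical
            extra_compile_args=["-std=c++11"],  # C++-only flags stay here
            libraries=["nativecore"],           # link against the clib
        ),
    ],
)
```

The design point is simply that build_clib and build_ext are separate compilation passes, so each source set gets only its own flags.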
I think I solved my issue. For anyone else who comes across this in the future:
Basically, by cd-ing into my package directory before running the cibuildwheel step, I was implicitly telling cibuildwheel that the new current directory is all that needs to be included in the container (which then makes accessing my source files impossible). To rectify this, I pass the package directory on the command line and remove my previous cd pkg/pypi:
python -m cibuildwheel pkg/pypi --output-dir wheelhouse
I appreciate the help, everyone. I'll close this ticket once I confirm there are no outstanding related issues.
Hello everyone,
Thanks for creating such a potentially time-saving tool. Unfortunately, I cannot seem to get it to work. I know it must be something I'm doing, but I can't pin it down.
I have an extension module with C and C++11 components. I specify them in setup.py and can build them (and the entire wheel) just fine on my local machine. However, when I try to use this tool I get the following error.
I've tried adjusting the images (manylinux2014, etc.), I've tried installing specific versions of Python, and I've even limited the build to things like CIBW_BUILD: cp39-* just to simplify things, but it always fails. I've tried adjusting the verbosity level to get more information about these allegedly incompatible symbols, but nothing.
It seems like when I lift the restrictions on the Python version being installed and the CIBW_BUILD setting, it will build many wheels just fine, but when it gets to the last wheel it fails — and that last wheel can be any arbitrary version depending on what I set my restrictions to.
Any hints? Thanks.