Closed: larsoner closed this 3 months ago
cc @cbrnr in case you want to know what I changed
@mscheltienne it might be worth modifying the CI structure to take the `cibuildwheel` outputs and test those wheels. It would ultimately save on build time and, I think, is closer to what you really want to test (i.e., closer to what users will actually see). Then you could just have a thinner `.yaml` to test what devs would do, like using the latest Python and system version to build and install or whatever.
I would also get rid of `uv` for now because of this:

> You cannot use uv currently on Windows for ARM or for musllinux on s390x as binaries are not provided by uv.

I'm 90% sure it's the cause of the Windows failures, so I'll disable it for now. (It's also nicer to stick with whatever `cibuildwheel` uses by default!)
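If the frontend ever needs to be set explicitly rather than left at the default, a minimal sketch would be (assuming `cibuildwheel`'s `build-frontend` option; `build[uv]` is the variant that breaks where uv ships no binaries):

```toml
[tool.cibuildwheel]
# stick with the plain "build" frontend instead of "build[uv]",
# which fails on platforms where uv provides no binaries
build-frontend = "build"
```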
@mscheltienne green and ready for review/merge from my end. The Release step now has:
```
Checking dist/antio-0.1.0-cp38-abi3-macosx_10_9_x86_64.whl: PASSED
Checking dist/antio-0.1.0-cp38-abi3-macosx_11_0_arm64.whl: PASSED
Checking dist/antio-0.1.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl: PASSED
Checking dist/antio-0.1.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl: PASSED
Checking dist/antio-0.1.0-cp38-abi3-win_amd64.whl: PASSED
Checking dist/antio-0.1.0.tar.gz: PASSED
```
Which looks like it should work once you cut a new release, assuming you set up trusted publishing.
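For reference, a trusted-publishing release job could look roughly like this (a sketch, not the project's actual workflow; job, environment, and artifact layout are assumptions):

```yaml
jobs:
  release:
    runs-on: ubuntu-latest
    environment: release   # must match the environment configured on PyPI
    permissions:
      id-token: write      # required for PyPI trusted publishing (OIDC)
    steps:
      - uses: actions/download-artifact@v4
        with:
          path: dist
          merge-multiple: true  # flatten per-job wheel artifacts into dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
```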
Great job @larsoner! Quick question: is it possible to build for win-arm64 yet? I think there will soon be some demand for that architecture, and it may be best to start providing it now.
It feels like I was both close and far at the same time :expressionless: I understand the changes, but could you provide details on the following points:

- What is the repair wheel step actually doing? This part is cryptic to me. More importantly, why does it need to be configured with one tool per platform, and why is it not part of the default `cibuildwheel` pipeline?
- Testing the built wheels makes a lot of sense. Yet, the `test-command` was kept in `pyproject.toml`, thus `pytest` is already run at the end of the build process by `cibuildwheel`. Is this duplicate necessary?
- The wheels are built on `cp312-*` as per the configuration in `pyproject.toml` and labelled `cp39` as per the `get_tag` method in `bdist_wheel_abi3`, and they can thus be installed on Python 3.9 to 3.12. Correct? If so, in which case is it necessary to have one wheel per Python version and a tag e.g. `cp39-cp39`?
- Why add `${{ strategy.job-index }}` to the artifact upload tag for the `aarch64` wheels?

@cbrnr `cibuildwheel` says the following about `CIBW_ARCHS`:
> On Windows, this option can be used to compile for ARM64 from an Intel machine, provided the cross-compiling tools are installed.
But I don't know how to set it up on a GitHub runner. Alternatively, a Windows ARM64 runner seems to be coming: https://github.blog/news-insights/product-news/arm64-on-github-actions-powering-faster-more-efficient-build-systems/ and https://github.com/actions/runner-images/issues/768
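In `pyproject.toml` terms, opting into Windows ARM64 cross-compilation would look roughly like this (a sketch; whether the cross-compiling tools are actually present on the hosted runner is the open question above):

```toml
# sketch: ask cibuildwheel to cross-compile ARM64 wheels on the Intel
# Windows runner, alongside the native AMD64 ones
[tool.cibuildwheel.windows]
archs = ["AMD64", "ARM64"]
```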
Besides the points above, trusted publishing is now set up. I'll have a look at this script from @behinger to create a proper `Raw` object with support for preloading, and will add short test files with 64- and 128-channel recordings using versions 1.8.2, 1.9.2 and 1.10.2 of the eego software.
I think I can answer some of these questions.
> What is the repair wheel step actually doing? This part is cryptic to me. More importantly, why does it need to be configured with one tool per platform, and why is it not part of the default `cibuildwheel` pipeline?
See https://cibuildwheel.pypa.io/en/stable/options/#repair-wheel-command. It's basically copying and relinking external libraries into the wheel. This is part of the `cibuildwheel` process, but it can be customized if necessary. These are actually the default values, but with an added `pipx run abi3audit --strict --report {wheel}` command. I don't know why this is necessary (maybe it isn't, and it just produces more informative output).
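A sketch of that configuration (values adapted from `cibuildwheel`'s documented per-platform defaults, with the extra `abi3audit` check appended; not necessarily the exact lines in this PR):

```toml
[tool.cibuildwheel.linux]
repair-wheel-command = [
    "auditwheel repair -w {dest_dir} {wheel}",       # default: copy/relink shared libs
    "pipx run abi3audit --strict --report {wheel}",  # extra: verify stable-ABI claims
]

[tool.cibuildwheel.macos]
repair-wheel-command = [
    "delocate-wheel --require-archs {delocate_archs} -w {dest_dir} -v {wheel}",
    "pipx run abi3audit --strict --report {wheel}",
]

[tool.cibuildwheel.windows]
# there is no repair step by default on Windows, so only the audit runs
repair-wheel-command = "pipx run abi3audit --strict --report {wheel}"
```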
> The wheels are built on `cp312-*` as per the configuration in `pyproject.toml` and labelled `cp39` as per the `get_tag` method in `bdist_wheel_abi3`, and they can thus be installed on Python 3.9 to 3.12. Correct? If so, in which case is it necessary to have one wheel per Python version and a tag e.g. `cp39-cp39`?
These builds use the stable CPython ABI, so anything built for e.g. `cp39-abi3` will run on Python >= 3.9 (there is in fact no upper bound). See for example here.
Edit: that said, it would perhaps be a good idea to include `abi3audit` in the runners.
Regarding Windows on ARM64, I think we could already build it inside the QEMU emulation job. Native test runners will be available on GitHub by the end of the year at the earliest.
`abi3audit` is run in the repair-wheel part, and now it makes sense why @larsoner changed the `cibuildwheel` defaults, which do not include it.
> These builds use the stable CPython ABI, so anything built for e.g. `cp39-abi3` will run on Python >= 3.9 (there is in fact no upper bound). See for example here.
I missed the `abi3` in the name. If I understand correctly, a wheel tagged `cp39-cp39` depends on C code that does not comply with `abi3`, and thus needs to be built independently on different Python versions with their specific `Python.h` headers. On the contrary, if the extension complies with `abi3`, then we can build it only once and it will work for any version above 3.2 (and we actually override this minimum bound with `cp39`, since the Python code does not support anything older anyway).
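The matching logic can be sketched in a few lines of Python (a simplification of what installers like pip actually do via `packaging.tags`, not the real implementation):

```python
import sys


def abi3_tag_candidates():
    """Stable-ABI wheel tags the running CPython interpreter will accept.

    Installers generate cpXY-abi3 candidates from the current minor
    version all the way down to 3.2, which is why a single cp39-abi3
    wheel installs on Python 3.9, 3.12, and anything newer.
    """
    major, minor = sys.version_info[:2]
    return [f"cp{major}{m}-abi3" for m in range(minor, 1, -1)]


print(abi3_tag_candidates())
```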
For Windows ARM64, the action for QEMU emulation fails with:

```
Error: no matching manifest for windows/amd64 10.0.20348 in the manifest list entries
```
> `abi3audit` is run in the repair-wheel part, and now it makes sense why @larsoner changed the `cibuildwheel` defaults, which do not include it.
Ah, of course :smile:! Should have looked a bit more closely!
Great work here! Just a note on the script I made: back then I didn't know much about MNE, so your mileage may vary... I also never updated it. I think it's nice that someone is finally taking proper care of this (honestly) embarrassing situation for ANT.
> Testing the built wheels makes a lot of sense. Yet, the `test-command` was kept in `pyproject.toml`, thus `pytest` is already run at the end of the build process by `cibuildwheel`. Is this duplicate necessary?
The idea is to build them once but test them in multiple places. So far it's just the oldest and newest Python (not yet 3.13, still waiting for NumPy and SciPy wheels) on each OS, but you could imagine testing different OS versions, etc. as well. `cibuildwheel` only tests on the build OS or Docker image with the version of Python that was used to build the wheel, so it's still a bit limited.
> Why add `${{ strategy.job-index }}` to the artifact upload tag for the `aarch64` wheels?
It is really just a safeguard to ensure that the artifacts always have distinct names. Not 100% necessary, but I've found it useful elsewhere.
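Concretely, the upload step would look something like this (a schematic sketch; the artifact name pattern is illustrative, not the exact one in the workflow):

```yaml
- uses: actions/upload-artifact@v4
  with:
    # upload-artifact@v4 fails on duplicate artifact names, so the matrix
    # job index guarantees uniqueness across the aarch64 build jobs
    name: wheels-${{ matrix.os }}-${{ strategy.job-index }}
    path: wheelhouse/*.whl
```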
Pushed a commit to try building Windows ARM64 following the docs, we'll see if it works!
> The idea is to build them once but test them in multiple places. So far it's just the oldest and newest Python (not yet 3.13, still waiting for NumPy and SciPy wheels) on each OS, but you could imagine testing different OS versions, etc. as well. `cibuildwheel` only tests on the build OS or Docker image with the version of Python that was used to build the wheel, so it's still a bit limited.
Yes, that's exactly the point of my question. Why even test through `cibuildwheel` if it's limited anyway? In this case, with tests that take < 10 s, it doesn't matter. But I guess that if your tests take 10 or 30 minutes or even longer, you would want to avoid running the test suite twice.
Thinking about it today, I pushed a `[[tool.cibuildwheel.overrides]]` entry to run the tests only on the emulated platforms that are not covered by the follow-up `test` job, but maybe that's also not the best (general) approach.
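Such an override might look like this (the `select` pattern and test command are illustrative, not the exact values in the PR):

```toml
[[tool.cibuildwheel.overrides]]
# run tests only for the emulated aarch64 builds, which the follow-up
# CI "test" job cannot cover
select = "*-manylinux_aarch64"
test-command = "pytest {package}/tests"
```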
> Why even test through `cibuildwheel` if it's limited anyway?
For Linux at the very least it's nice, because then you're testing inside the (quite old!) Docker image, which would be annoying to set up afterward.
If you really want to skip it after thinking about it, a better place than `pyproject.toml` would be the `ci.yaml` file, because there we're guaranteed that we are testing everywhere we want to. And that way you won't disable tests in local debugging / building / dev with `python -m cibuildwheel`.
That sounds good; the age of the `manylinux` Docker image was indeed my remaining concern.
Okay @mscheltienne I think I'm done. Added support for Windows ARM64, though it doesn't get tested at the moment unfortunately.
Looks like windows-arm64 wasn't as simple to support :wink: Thanks a lot, merging; I'm also almost done with the python reader part.
Fixed a `cmake` path bug by:

- Setting `Python3_EXECUTABLE` and making the Python find `REQUIRED`, so it quits in the configure step instead of cryptically erroring later if Python isn't found
- Using `Python3_add_library` instead of `add_library`, so that includes etc. are set automatically by `Development.Module`... except we want `abi3` mode for the module, so just require `Development.SABIModule` and change `PyList_SET_ITEM` (not ABI stable) to `PyList_SetItem` (abi stable)
- In `pyproject.toml`, no longer shipping the `.so` or library package data (let `auditwheel` and related utils pull in and package the `libeep` library)

With these changes, `python -m cibuildwheel` succeeds on my Linux machine, as does installing and testing the resulting wheel afterward. Might need some cleanups on macOS; not sure the abi3 paths / prefix stuff will be correct there.
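Put together, the CMake side of those changes might look roughly like this (target and source names are made up; `USE_SABI` requires CMake >= 3.26):

```cmake
# fail at configure time if a stable-ABI-capable Python dev setup is missing
find_package(Python3 REQUIRED COMPONENTS Development.SABIModule)

# Python3_add_library sets up include dirs, module suffixes, etc. automatically;
# USE_SABI 3.9 defines Py_LIMITED_API to match the cp39-abi3 wheel tag
python3_add_library(antio_ext MODULE USE_SABI 3.9 src/antio_ext.c)
```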