bodono / scs-python

Python interface for SCS
MIT License

Build wheels in GitHub Actions #25

Closed · mtreinish closed this 3 years ago

mtreinish commented 3 years ago

This PR adds CI jobs in GitHub Actions to build release wheels for Linux, macOS, and Windows. The idea behind it is that at release time we can use GitHub Actions to build wheels that are distributed on PyPI as precompiled binaries, so users don't need to compile SCS locally for their environment every time they install it. It uses cibuildwheel to perform the wheel builds, which automates most of the build process, so we get binaries for all the supported environments.

Right now the PR runs these builds on every commit, but it can/should be switched to trigger only on tags/releases. The jobs can also be set up to automatically upload the binaries to PyPI with twine if desired (this is typically the configuration I use for releases).
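Roughly, the shape of the workflow is something like this (a simplified sketch; action versions, names, and the submodule assumption are illustrative, not the exact file in the PR):

```yaml
# Simplified sketch of a cibuildwheel job; the real workflow has more configuration.
name: Build wheels

on: [push]   # currently every commit; could be restricted to tags/releases

jobs:
  build_wheels:
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    steps:
      - uses: actions/checkout@v2
        with:
          submodules: recursive   # assuming the SCS C sources are pulled in as a git submodule

      - uses: actions/setup-python@v2

      - name: Build wheels with cibuildwheel
        run: |
          python -m pip install cibuildwheel
          python -m cibuildwheel --output-dir wheelhouse

      - uses: actions/upload-artifact@v2
        with:
          path: wheelhouse/*.whl
```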

TODO:

Right now the Windows builds are not working: all the tests return NaN instead of the expected results (see https://github.com/mtreinish/scs-python/runs/1524626020?check_suite_focus=true#step:9:73). The lengthy git log for the PR is from all of my attempts to get this working on Windows. I tried using cibuildwheel at first but hit this same issue, so I decided to switch to using conda and Python directly to have more control over the build process. This will need to be fixed so we can also distribute precompiled binaries for Windows users.

bodono commented 3 years ago

This is great, thank you for putting this all together!

I like the idea of automatically uploading the binaries; that makes everything very simple. I will think about whether every commit or just releases is better; for now, what you have is fine.

For the Windows builds, it looks like the SDP problems are the failing ones. For SDPs we need to link against BLAS and LAPACK libraries (for the eigendecomposition routines). I notice a few error messages in the logs that suggest it can't find the LAPACK/BLAS libs, e.g., here. The way SCS tries to link against those libs is by using numpy's `get_info` method. It seems that in the case you posted numpy thinks it is linked against MKL, but that library is actually missing. There are a couple of options here: one is to use a different numpy wheel that hopefully points to the right place. Another is to install MKL (I think there is a conda version), which might just work out of the box. The other is to manually install LAPACK/BLAS and then point SCS to the manually installed ones (via the environment variable used here), though I'm not sure how that would work in the distributed wheel.
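One way to see what `get_info` is actually picking up in the Windows job would be a throwaway diagnostic step along these lines (just an illustration to drop into the Windows job's steps, not something already in the PR):

```yaml
# Debugging idea: print what numpy's get_info resolves for LAPACK/BLAS,
# so we can see whether it really points at a missing MKL install.
- name: Show BLAS/LAPACK detected by numpy
  run: |
    python -c "from numpy.distutils.system_info import get_info; print('lapack_opt:', get_info('lapack_opt')); print('blas_opt:', get_info('blas_opt'))"
```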

bodono commented 3 years ago

Hi Matthew, do you want me to pick this up (I don't really have cycles now, but hopefully might soon), or are you planning on finishing it? I really want to get this checked in as I think it's really useful!

bodono commented 3 years ago

Hi @mtreinish, are you still working on getting this merged in?

mtreinish commented 3 years ago

> Hi @mtreinish, are you still working on getting this merged in?

Yeah, I was heads down on other work for a while, but recently I picked it back up and have been trying to fix the Windows builds (which are the only thing still not working), though without much luck.

bodono commented 3 years ago

Ok great, thanks for putting in the work. If you ever feel like you want to give up on the Windows builds and get this merged anyway, let me know!

bodono commented 3 years ago

I finally managed to get Windows working with GitHub Actions and passing tests. The YAML file is here. Does this help you overcome the Windows issues at all?

mtreinish commented 3 years ago

@bodono thanks, that was enough of a hint for me to get the Windows builds working: https://github.com/mtreinish/scs-python/runs/2766833786?check_suite_focus=true

I think this is ready for review now. There are two small things, though. First, I couldn't get win32 binaries to build because of https://github.com/conda-incubator/setup-miniconda/issues/166 (I couldn't initialize a 32-bit conda env). If that ever gets fixed it would be good to add 32-bit builds, since a lot of Windows users end up with a 32-bit version of Python (even if they're on a win64 OS). Second, right now the wheels are built and just uploaded to the GHA artifacts for the workflow, so to publish them you'd have to download them from the artifacts and upload them to PyPI manually. I personally typically set up the jobs to also publish to PyPI, for example: https://github.com/Qiskit/retworkx/blob/main/.github/workflows/wheels.yml#L65-L69 (although that only runs on tags, not on every commit). I can adjust this however you prefer.
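The pattern I mean is roughly the following (a sketch; the tag pattern and secret name are placeholders, not necessarily what the linked workflow uses):

```yaml
# Sketch of a tag-only trigger plus a PyPI upload step appended to the wheel job.
on:
  push:
    tags:
      - 'v*'            # build/publish only when a version tag is pushed

jobs:
  build_and_publish:
    runs-on: ubuntu-latest
    steps:
      # ... cibuildwheel build steps as above, collecting wheels in wheelhouse/ ...
      - name: Publish wheels to PyPI
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}   # placeholder secret name
        run: |
          python -m pip install twine
          twine upload wheelhouse/*.whl
```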

bodono commented 3 years ago

Oh awesome, thanks for all your work on this. On the second point, I agree we should publish to PyPI on tags; that seems very useful. I noticed cvxpy does the same thing here.

h-vetinari commented 3 years ago

Nice progress @mtreinish! Might have to merge master or rebase, though - AppVeyor is gone now.

> If that ever gets fixed it would be good to add 32-bit builds, since a lot of Windows users end up with a 32-bit version of Python (even if they're on a win64 OS).

I'd be interested where you get that impression (or data!). Confusingly, `sys.platform` reports `win32` even on 64-bit Python runtimes. 32-bit builds haven't been built by Anaconda for years already; the amount of relevant 32-bit-only Windows hardware out there is probably smaller than the Python 2 user base these days (until very recently this was why PyPy couldn't be built for Windows in conda). I see that CPython still provides 32-bit binaries, but I couldn't find download stats, because pypistats doesn't distinguish the bitness of Windows.

mtreinish commented 3 years ago

> Nice progress @mtreinish! Might have to merge master or rebase, though - AppVeyor is gone now.
>
> > If that ever gets fixed it would be good to add 32-bit builds, since a lot of Windows users end up with a 32-bit version of Python (even if they're on a win64 OS).
>
> I'd be interested where you get that impression (or data!). Confusingly, `sys.platform` reports `win32` even on 64-bit Python runtimes. 32-bit builds haven't been built by Anaconda for years already; the amount of relevant 32-bit-only Windows hardware out there is probably smaller than the Python 2 user base these days (until very recently this was why PyPy couldn't be built for Windows in conda). I see that CPython still provides 32-bit binaries, but I couldn't find download stats, because pypistats doesn't distinguish the bitness of Windows.

This comes up all the time for projects I maintain, because most of my user base doesn't use conda. There was a time I made the same assumption and didn't bother with wheels for win32, and I would constantly get issues opened about pip install failing. I assume they're running 64-bit Windows but Python with a win32 binary (though I think there are still a fair number of win32 users out there too). I don't have hard numbers to back this up, just anecdotal evidence from issues being opened where the cause came down to a win32 binary being used (which is why I try to build win32 binaries everywhere).

bodono commented 3 years ago

Sorry I took so long on this, LGTM.