Closed enzbus closed 11 months ago
Hi Enzo! Yes, you're right (although currently the PR is blocked on scipy not being available via conda for python 3.12).
If you know how to fix this and/or upgrade the build system, I would be happy to accept the help!
Ok, here's what I have on my machine. Output of get_info on Python 3.11:
>>> get_info('blas_opt')
{'extra_compile_args': ['-I/System/Library/Frameworks/vecLib.framework/Headers'], 'extra_link_args': ['-Wl,-framework', '-Wl,Accelerate'], 'define_macros': [('NO_ATLAS_INFO', 3), ('HAVE_CBLAS', None), ('ACCELERATE_NEW_LAPACK', None)]}
>>> get_info('lapack_opt')
{'extra_compile_args': ['-I/System/Library/Frameworks/vecLib.framework/Headers'], 'extra_link_args': ['-Wl,-framework', '-Wl,Accelerate'], 'define_macros': [('NO_ATLAS_INFO', 3), ('HAVE_CBLAS', None), ('ACCELERATE_NEW_LAPACK', None)]}
And output of show_config (both on 3.11 and 3.12, from the newest numpy):
>>> np.show_config(mode='dicts')['Build Dependencies']
{'blas': {'name': 'openblas64', 'found': True, 'version': '0.3.23.dev', 'detection method': 'pkgconfig', 'include directory': '/opt/arm64-builds/include', 'lib directory': '/opt/arm64-builds/lib', 'openblas configuration': 'USE_64BITINT=1 DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= SANDYBRIDGE MAX_THREADS=3', 'pc file directory': '/usr/local/lib/pkgconfig'}, 'lapack': {'name': 'openblas64', 'found': True, 'version': '0.3.23.dev', 'detection method': 'pkgconfig', 'include directory': '/opt/arm64-builds/include', 'lib directory': '/opt/arm64-builds/lib', 'openblas configuration': 'USE_64BITINT=1 DYNAMIC_ARCH=1 DYNAMIC_OLDER= NO_CBLAS= NO_LAPACK= NO_LAPACKE= NO_AFFINITY=1 USE_OPENMP= SANDYBRIDGE MAX_THREADS=3', 'pc file directory': '/usr/local/lib/pkgconfig'}}
On my machine they pick up different blas and lapack; I'm not sure how to proceed.
Scipy for python 3.12 is now available in conda-forge :)
Just saw this in https://github.com/conda-forge/scs-feedstock/pull/35 - the use of numpy.distutils makes it impossible to support this on 3.12 at the moment. The numpy.distutils package has been deprecated and will never see a release for Python 3.12. If you're relying on it in an essential way, you'll probably have to switch build backends. SciPy moved to meson, for example, and so did numpy, pandas, etc. It's not a small undertaking though...
(@enzbus) On my machine they pick up different blas and lapack ... Output of get_info on 3.11 ... (shows use of Accelerate) Output of show_config .. from newest NumPy ... (shows use of OpenBLAS)
The code that uses get_info() in setup.py and here is fragile. It will give you the information used to build numpy. But what if numpy is installed from a wheel built on a different machine? I am surprised this workflow can successfully build the project, unless your CI/developer workflow forces building NumPy from source before building scs-python.
What to do? One person's opinion: the correct thing to do here is move to a different build system with built-in mechanisms for finding BLAS/LAPACK (meson is what SciPy/NumPy settled on). An alternative to what you have now, still using the setup.py workflow, would be to require use of pkg-config to find a working BLAS/LAPACK. You could use subprocess.check_output(['pkg-config', blaslib, '--cflags/libs/modversion']) and such to detect the flags.
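A minimal sketch of that pkg-config approach might look like the following. This is illustrative only: it assumes pkg-config is on the PATH and that a .pc file exists for the chosen BLAS (here "openblas" as an example name); it returns None when either is missing.

```python
# Hypothetical sketch of the pkg-config detection suggested above.
import subprocess

def pkg_config_flags(lib):
    """Query pkg-config for compile/link flags of `lib`, or None on failure."""
    flags = {}
    for key, arg in [("cflags", "--cflags"), ("libs", "--libs"),
                     ("version", "--modversion")]:
        try:
            out = subprocess.check_output(["pkg-config", lib, arg], text=True)
        except (OSError, subprocess.CalledProcessError):
            return None  # pkg-config missing, or lib has no .pc file
        flags[key] = out.strip()
    return flags

info = pkg_config_flags("openblas")
print(info)  # e.g. {'cflags': '-I/usr/include/openblas', ...} or None
```

A setup.py could then feed `flags["cflags"]` / `flags["libs"]` into the Extension definition, failing loudly if the function returns None.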
Thanks @mattip; I agree with you that we should migrate the build system, as this code is quite old. I know it's a lot to ask, but if numpy.show_config could also show the flags and macros used by meson (which would be useful in general for numpy users and the ecosystem), we would be able to keep the existing build system.
Yes I agree 100% we should migrate this build system. Does anyone know how to do so? I am very happy to accept PRs on this.
I'm looking into it, some points:
- Calling setup.py directly like it's done here is deprecated; the way to do it nowadays is pip install .
- We should add -Werror flags to make sure deprecation warnings break the GitHub CI in the future (numpy.distutils has been deprecated for years)
- Uploading to PyPI via setup.py is, I guess, deprecated too; I'll try to fix the PyPI packaging and uploading part

I think you have two good options to migrate:
1. meson-python
2. scikit-build-core

I'd normally suggest (1), because that's what NumPy/SciPy/Pandas et al did and it has the features you need. However, I see SCS, added as a submodule in this repo, uses CMake already. If you're building SCS as part of the scs-python build, it probably makes sense to go with scikit-build-core.
> We should add -Werror flags to make sure deprecation warnings break the GitHub CI
Just to make sure: only in one or more CI jobs, not in the actual build definition (that would be quite bad and cause extra breakage with newer compilers).
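For instance, the flag could live in a dedicated CI job's environment rather than in the package's own build flags (a hypothetical GitHub Actions fragment; job and step names are assumptions):

```yaml
# Hypothetical CI job: -Werror only here, never in the package's build definition.
jobs:
  build-strict:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build with warnings as errors
        env:
          CFLAGS: "-Werror"
        run: pip install .
```

Keeping it out of setup.py means end users on newer compilers never hit spurious build failures.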
Also note that -Werror is for compilers only; it would not catch deprecation warnings emitted from Python. For that, you'd set filterwarnings = error in pytest.ini (assuming you run tests with pytest).
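A minimal pytest.ini along those lines (assuming the test suite runs under pytest) would be:

```ini
[pytest]
filterwarnings =
    error
```

With this, any warning raised during the tests, including DeprecationWarning from Python code, fails the run.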
The current strategy doesn't compile the SCS submodule separately to link against; it just compiles the whole thing together, which at the time I thought was simpler. It sounds like meson is the way to go then, however I have absolutely no experience with it!
I'm looking into meson; I haven't done C packaging in a while. It looks like a no-brainer: all core graphics Linux libraries use it, and even GNOME. And it's written in Python! Do you want to do it @bodono? Your setup.py would translate to a few tens of lines of meson.build (if you want to keep all the optional builds), or about 10 if you go with plain packages only. Also, I would suggest trimming down the GitHub CI YAML; in my experience that's the hardest part to maintain.
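As a rough illustration of the scale involved (module name, source paths, and dependency names here are assumptions, not the actual scs-python layout), a minimal meson.build for a Python C extension could look like:

```meson
# Hypothetical minimal meson.build for a Python C extension (meson-python style).
project('scs-python', 'c', version: '0.1.0')

py = import('python').find_installation(pure: false)
blas_dep = dependency('openblas', required: false)   # assumed BLAS name

py.extension_module(
  '_scs',                  # assumed extension module name
  ['src/scsmodule.c'],     # assumed source file
  dependencies: [blas_dep],
  install: true,
)
```

Meson's dependency() handles the BLAS/LAPACK lookup that the old get_info() hack was doing by hand.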
Hello @rgommers; can I ask you for quick feedback / obvious mistakes you see here: https://github.com/bodono/scs-python/pull/70/files#diff-30d8f6be6320feeacf686be94f48c70869b52630e01ea625f0f15adc0d57c3e4 ? We're trying to get a basic meson build up.
Sure, having a quick look now, and can test later today.
Thanks again for sharing your knowledge @rgommers !
SCS only uses one or two LAPACK functions (eigendecompositions). Do you think it would make sense to put OpenBLAS as a submodule and tweak the meson.build to find them and compile them into SCS directly (we only need the relevant codepaths), as a fallback if no linkable BLAS is found? I guess that would simplify distributing a lot.
I would take them from reference LAPACK then, not OpenBLAS. NumPy does exactly that: it has f2c'd versions of the reference LAPACK routines which it falls back to when no suitable BLAS library is found. The annoying thing is the silent and large performance regression (it can be up to 50x or so). In NumPy 1.26.0 we added a -Dallow-noblas flag with a default of false to prevent the silent regressions. That then of course caused issues for folks who wanted to build from source (e.g. in CI when building their own wheels on niche platforms) but couldn't without passing a flag, which pip/cibuildwheel can make difficult for you.
There's no winning here really. I'd suggest that if the performance impact isn't a complete show-stopper, it's a good idea to vendor the routines you need as a fallback. And put in a safeguard to ensure that you never hit the fallback when you build wheels to distribute.
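One possible shape for such a safeguard, sketched below; the option name is borrowed from NumPy's -Dallow-noblas, and everything else (dependency name, file split) is an assumption:

```meson
# Sketch only: opt-in guard for a vendored LAPACK fallback.

# meson_options.txt
option('allow-noblas', type: 'boolean', value: false,
       description: 'Allow building against the slow vendored LAPACK fallback')

# meson.build
blas_dep = dependency('openblas', required: false)
if not blas_dep.found() and not get_option('allow-noblas')
  error('No BLAS/LAPACK found; pass -Dallow-noblas=true to accept the fallback')
endif
```

Wheel-building CI would then simply never pass the flag, so a distributed wheel can never silently ship the slow path.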
Hello @bodono; PR #68 is not enough to build scs on 3.12. The culprit is this function, which uses numpy.distutils, not available on 3.12. One workaround could be to use numpy.show_config() instead, which gives you some blas and atlas information, but probably not all you need. I'm happy to help.
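A sketch of that workaround might be the following: read the BLAS details from NumPy's own build metadata instead of numpy.distutils.get_info(). This assumes a reasonably new NumPy (mode='dicts' needs 1.25+); on older versions, or without numpy installed, it just returns None.

```python
# Hypothetical sketch: query numpy's build metadata without numpy.distutils.
try:
    import numpy as np
except ImportError:  # keep the sketch self-contained if numpy is absent
    np = None

def numpy_blas_info():
    """Return numpy's BLAS build-dependency dict, or None if unavailable."""
    if np is None:
        return None
    try:
        cfg = np.show_config(mode='dicts')
    except TypeError:  # NumPy < 1.25: show_config() only prints, no mode arg
        return None
    return cfg.get('Build Dependencies', {}).get('blas')

info = numpy_blas_info()
print(info)  # e.g. {'name': 'openblas64', 'found': True, ...} or None
```

Note the caveat from earlier in the thread still applies: this describes the machine where numpy was built, which may not be the machine you are building on.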