mgedmin closed this 2 years ago
https://github.com/zopefoundation/meta/pull/153#issuecomment-1239907333 makes me want to add a regression test for this.
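For reference, here is a minimal sketch of what such a regression test could look like (my illustration, not necessarily the actual test_compile_flags.py; the BTrees.OOBTree import is just a stand-in for "any C extension module from the package"). The import has to happen first: a wheel built with -Ofast/-ffast-math gets crtfastmath.o linked in, which sets the FTZ/DAZ CPU flags as soon as the shared object is loaded, and from then on subnormal doubles in the whole process are flushed to zero.

import unittest

import BTrees.OOBTree  # noqa: F401 -- loads the C extension under test


class TestFloatingPoint(unittest.TestCase):

    def test_no_fast_math_optimization(self):
        # 2**-1022 is the smallest normal double; scaling it by 2**-52
        # yields 2**-1074, the smallest subnormal.  With FTZ/DAZ enabled
        # (the -ffast-math side effect) the product is flushed to 0.0.
        # Separate names keep CPython from constant-folding the product.
        min_normal = 2.0 ** -1022
        scale = 2.0 ** -52
        smallest_subnormal = min_normal * scale
        self.assertNotEqual(smallest_subnormal, 0.0)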
I verified that the regression test catches the problem by doing
$ export CFLAGS=-Ofast
$ tox -re py38 -- -pvc -m test_compile_flags
...
Failure in test test_no_fast_math_optimization (BTrees.tests.test_compile_flags.TestFloatingPoint)
Traceback (most recent call last):
File "/home/mg/opt/python38/lib/python3.8/unittest/case.py", line 60, in testPartExecutor
yield
File "/home/mg/opt/python38/lib/python3.8/unittest/case.py", line 676, in run
self._callTestMethod(testMethod)
File "/home/mg/opt/python38/lib/python3.8/unittest/case.py", line 633, in _callTestMethod
method()
File "/home/mg/src/zopefoundation/BTrees/src/BTrees/tests/test_compile_flags.py", line 29, in test_no_fast_math_optimization
self.assertNotEqual(smallest_subnormal, 0.0)
File "/home/mg/opt/python38/lib/python3.8/unittest/case.py", line 921, in assertNotEqual
raise self.failureException(msg)
AssertionError: 0.0 == 0.0
...
Hey, .manylinux-install.sh has an export CFLAGS="-Ofast $CFLAGS". That one comes from .meta.toml in this repo, and it needs to be updated as well.
The test indicates that @srett was right and -fno-unsafe-math-optimizations does not override -Ofast.
-fno-fast-math doesn't either.
I think we're going to have to switch to -O3 or something like that.
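To double-check claims like that locally, a rough probe like the following can help (my own sketch, not part of the repo; the probe.c/probe.so file names and the times() symbol are made up). It compiles a tiny shared library with a given set of flags and then checks, in a fresh Python subprocess each time, whether the smallest subnormal double survives a multiplication routed through that library; a fresh process matters because once an -Ofast-built library has set FTZ/DAZ, the flags stay set.

import os
import subprocess
import sys
import tempfile
import textwrap

C_SOURCE = 'double times(double a, double b) { return a * b; }\n'

PROBE_PY = textwrap.dedent("""
    import ctypes, sys
    lib = ctypes.CDLL(sys.argv[1])
    lib.times.restype = ctypes.c_double
    lib.times.argtypes = [ctypes.c_double, ctypes.c_double]
    # 2**-1022 * 2**-52 is 2**-1074, the smallest subnormal double;
    # with FTZ/DAZ enabled the multiplication flushes it to 0.0.
    sys.exit(0 if lib.times(2.0 ** -1022, 2.0 ** -52) != 0.0 else 1)
""")


def subnormals_survive(cflags):
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, 'probe.c')
        lib = os.path.join(tmp, 'probe.so')
        with open(src, 'w') as f:
            f.write(C_SOURCE)
        # Build a throwaway shared library with the flags under test.
        subprocess.check_call(
            ['gcc', '-shared', '-fPIC', *cflags.split(), src, '-o', lib])
        # Probe it in a clean interpreter so FTZ/DAZ state can't leak over.
        return subprocess.run([sys.executable, '-c', PROBE_PY, lib]).returncode == 0


for flags in ('-O3', '-Ofast', '-Ofast -fno-unsafe-math-optimizations',
              '-Ofast -fno-fast-math'):
    print(flags, '->', 'subnormals ok' if subnormals_survive(flags) else 'flushed to zero')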
Let's try this again once https://github.com/zopefoundation/meta/pull/155 is merged.
The new test is failing and I have no idea why. There's no sign of -Ofast or -ffast-math in the build log. Local testing with CFLAGS=-O3 and tox -r passes for me. Somebody help?
BTrees depends on persistent, i.e. another extension implemented in "C". Maybe this comes from the pip cache and has been compiled with -Ofast.
This sounds very plausible!
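One quick way to test the hypothesis (my own sketch, not from the thread; it assumes importing persistent pulls in its C extension, as it normally does when the C speedups are available): check whether subnormals are still alive before and after the import in a fresh interpreter. A wheel built with -Ofast flips the FTZ/DAZ flags the moment its shared object is loaded.

def subnormals_alive():
    # 2**-1022 * 2**-52 == 2**-1074, the smallest subnormal double;
    # it comes out as 0.0 once FTZ/DAZ are enabled.
    min_normal = 2.0 ** -1022
    scale = 2.0 ** -52
    return min_normal * scale != 0.0


assert subnormals_alive(), 'subnormals already broken before any import'

import persistent  # noqa: E402 -- the suspect (possibly pip-cached) wheel

print('persistent built with fast-math?', not subnormals_alive())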
I see persistent also has CFLAGS: -Ofast in the GitHub Actions workflow, but not in .manylinux-install.sh. I think the binary wheels on PyPI don't have the math trap, or persistent would have been mentioned in https://moyix.blogspot.com/2022/09/someones-been-messing-with-my-subnormals.html. (RelStorage was mentioned there, but I'm pretty sure BTrees doesn't depend on RelStorage.)
I hadn't considered the possibility of GitHub Actions caches containing locally built persistent wheels.
I've wiped the Linux-pip-3.10 and Linux-pip-2.7 caches with
gh api --method DELETE -H "Accept: application/vnd.github+json" /repos/zopefoundation/BTrees/actions/caches?key=Linux-pip-3.10
gh api --method DELETE -H "Accept: application/vnd.github+json" /repos/zopefoundation/BTrees/actions/caches?key=Linux-pip-2.7
so let's see if that affects anything when the build is retried. I need to force-push anyway because I entered the wrong GitHub issue number in the original commit message.
Incidentally, aarch64 builds take forever. .manylinux-install.sh attempts to work around this by switching to -O1, but it prepends the -O1 flag at the front of CFLAGS, which means the -O3 from the environment specified in GitHub Workflow overrides it back to slow mode again.
Incidentally, I noticed that two workflow runs are started with every push: one for the commit and one for the PR. The workflow skips all the tests in the PR run (yay, avoiding duplication), but it still builds manylinux packages twice. Bug in the workflow configuration?
Looks like the cache clearing helped.
Incidentally, aarch64 builds take forever. .manylinux-install.sh attempts to work around this by switching to -O1, but it prepends the -O1 flag at the front of CFLAGS, which means the -O3 from the environment specified in GitHub Workflow overrides it back to slow mode again.
Nah, I was wrong, there's an export CFLAGS="-pipe" before invoking .manylinux-install.sh that overrides the environment. But even at -O1 the build takes over 2 hours!
Ok, now this looks good to me.
Supersedes and closes #179. (I wonder if GitHub will parse that?)
I had trouble running the tests locally (no Python 3.5, and my 3.6 build seems to be broken), so let's try it on CI.