Closed · spenczar closed this 1 year ago
The build failure here is fixed in c920192 and bc16af4. I'll wait for that branch to land and rebase this one.
I'll make the README changes here and merge.
If we're going to be doing this a lot, I wonder if we shouldn't instead start publishing namespaced forks to PyPI (I have a team account set up). We'd publish something like ai-healpy, and then we could include it as a simple dependency and the pip server would figure out how to serve the right binary wheels.
Yeah, maybe. I don't think this will work through transitive dependencies, though: if something else depends on healpy, then pip will try to install both packages, which could get very confusing very quickly. We would then need to fork that package too and build our own ai- edition. For example, we'd need to build a separate package for thor, since it depends on healpy. Running a private PyPI, and using it as the index in pip commands, would avoid that problem. Maybe adding a PyPI to garden would not be too hard; I don't really know.
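As a rough sketch of the private-index idea (the internal index URL below is made up, not a real server), pip can be pointed at a private index via `pip.conf`, with public PyPI as a fallback:

```ini
# Hypothetical pip.conf — the internal index URL is an assumption
[global]
index-url = https://pypi.internal.example/simple/
extra-index-url = https://pypi.org/simple/
```

One caveat: pip gives no priority between `index-url` and `extra-index-url`, so for the fork's healpy wheels to reliably win, the private server would typically proxy PyPI and serve its own healpy wheels under the upstream name.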
In working through #59, I wanted to make it so that you could run pre-commit in a dockerized workflow. Ideally, you'd be able to do something like `docker-compose run precovery pre-commit run --all-files`, and it'd scan all files, just like you'd expect. But I found this was too slow when using x86_64 emulation on my MacBook. In particular, running mypy checks across all files took more than 40 minutes (!!!). I ended up killing the process and giving up.
The only reason we're using x86_64 emulation right now is that healpy doesn't have a binary wheel for aarch64 released yet. The work to make aarch64 wheels was done in https://github.com/healpy/healpy/pull/819, but there just hasn't been a healpy release since then. So, I decided to copy the pattern set in #57: I made a B612 fork of healpy, and used it to host built wheels. I set up the Dockerfile to manually install the wheel.
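For reference, the wheel-install step in the Dockerfile might look something like this (the fork URL, release tag, and wheel filename are placeholders, not the actual artifact names):

```dockerfile
# Hypothetical sketch: install the pre-built aarch64 healpy wheel from the
# B612 fork before the rest of the dependencies, so pip doesn't try to
# build healpy from source. URL and filename are placeholders.
RUN pip install \
    https://github.com/B612-fork/healpy/releases/download/vX.Y.Z/healpy-X.Y.Z-cp310-cp310-manylinux_2_17_aarch64.whl
RUN pip install -r requirements.txt
```

Installing the wheel in its own layer before `requirements.txt` also means Docker can cache it, so rebuilds don't re-download it.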
With this, the mypy checks take a few seconds, so running pre-commit from within Docker is feasible.