Closed: tacaswell closed this issue 4 years ago
You can also take a look at https://github.com/pyFFTW/pyFFTW-wheels as reference.
ah, they are also using multibuild; that is how we build the Matplotlib and h5py wheels as well.
@ashwinvis you want to take a crack at setting that up? I think we would have to get @matthewbrett to set up the RSA key for us? I have the power to create new repos in this org, but would want to check with @kmike before doing that.
FTR, I like how the builds are organized at https://github.com/scrapinghub/python-crfsuite (thanks to the work of @datamade folks). It is all in the same repo, and wheels for OS X / linux / Windows are built on git tags. Setting up Windows builds required additional AppVeyor setup; OS X and Linux were covered by Travis CI.
Feel free to create an additional repo though, if you decide that's the best way forward.
@kmike That does look nice! I'm inclined to do this "by hand" today and then adopt an automated solution like this going forward. Do you approve the release plan otherwise?
Also, small world: I crossed paths with some of the datamade people through the Chi Hack Night!
> Do you approve the release plan otherwise?
Yes.
I prefer not to require Cython at install time: it is an additional dependency, and a Cython upgrade could break the package. Tests are run at release time, but the package can later be installed with a different Cython version, older or newer. That said, I can understand the reasons for having it that way; at least it is simpler.
The best of all worlds seems to be generating the C files on CI for the release, but without good automation of releases that may get us into a situation similar to this Python 3.8 problem, where you need to regenerate the files to make datrie work with a more recent Python.
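As an aside, one way to hedge the "installed with a different Cython version" concern is to pin Cython in the PEP 518 build requirements, so pip installs a known Cython into an isolated environment when building from the sdist. A sketch, not what datrie currently ships; the version bounds are purely illustrative:

```toml
# pyproject.toml (illustrative version bounds, not datrie's actual config)
[build-system]
requires = ["setuptools", "wheel", "Cython>=0.29,<3.0"]
build-backend = "setuptools.build_meta"
```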
I would argue that if the behavior of datrie changes based on the Cython version, that is a Cython bug, and I would treat it the same as a patch version of Python introducing a regression. But I see your point.
If you include the C files but use Cython when it is available, you are in the same boat as requiring Cython at build time in terms of bugs, and you are still exposed to CPython internals changing when Cython is absent. By not including the C files you expose yourself to only one failure mode instead of two ;). Given that CPython is moving to a 12-month cadence, I am personally more worried about CPython moving under us than about the (hopefully transient) Cython issues.
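For readers following along, the "include the C files, but use Cython if it is there" pattern being debated looks roughly like this in a setup.py. This is a hypothetical sketch, not datrie's actual build code, and the module path is made up:

```python
# Hypothetical setup.py fragment illustrating the "use Cython if
# available, otherwise compile the shipped .c file" pattern.
try:
    from Cython.Build import cythonize  # optional at build time
    HAVE_CYTHON = True
except ImportError:
    HAVE_CYTHON = False


def pick_sources(have_cython):
    """Choose the .pyx source when Cython is present, else the pre-generated C."""
    ext = ".pyx" if have_cython else ".c"
    return ["src/datrie" + ext]  # illustrative path


# In a real setup.py one would then do something like:
#   exts = [Extension("datrie", pick_sources(HAVE_CYTHON))]
#   if HAVE_CYTHON:
#       exts = cythonize(exts)
```

The two-failure-mode argument above is about exactly this branch: the `.c` path can go stale against new CPython internals, while the `.pyx` path depends on whichever Cython happens to be installed.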
For reference, h5py has required Cython as a build-time dependency since at least 2015 (in part because it builds against your local HDF5) and has not, to my knowledge, had issues with Cython breaking under us.
One advantage of hosting the wheel machinery outside is that you don't have to version-rev to get wheels for new versions of Python when they come out.
Fair enough :)
> For reference, h5py has been requiring cython as a build time dependency from at least 2015 (in part because we will build against your local hdf5) and have not to my knowledge had issues with cython breaking under us.
Cython 3.0 is going to change the default language version from Python 2 to 3; this changes the semantics of .pyx code and can break it. I'm not sure datrie is currently ready for that. It would be good to set language_level explicitly.
ah, good point re the language level. I'll add that to #80 .
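For anyone following along, the directive can be set per-file in a comment at the very top of each .pyx, or globally via `cythonize(..., compiler_directives={"language_level": "3"})`. A small sketch; the helper function here is hypothetical, just to show the directive syntax:

```python
# The per-file Cython directive must appear in a comment at the very
# top of the .pyx file, e.g.:
#
#     # cython: language_level=3
#
# This hypothetical helper prepends the directive to a .pyx source
# string if no language_level directive is already present.
DIRECTIVE = "# cython: language_level=3"


def ensure_language_level(pyx_source):
    first_line = pyx_source.splitlines()[0] if pyx_source else ""
    if first_line.startswith("# cython:") and "language_level" in first_line:
        return pyx_source  # already set explicitly; leave it alone
    return DIRECTIVE + "\n" + pyx_source
```

In setup.py, the equivalent global setting is `cythonize(extensions, compiler_directives={"language_level": "3"})`.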
Update: I have located an OS X machine I can use (but have not yet tried to use it) and am currently playing with the manylinux1 docker image (~everything but the py38 one is working~ :sheep: the py38 build was finding local build products rather than the installed version from the wheel).
I have also heard back from CGohlke, and this change should not break any of his tooling.
Assuming you have conda installed, the following will build the mac wheels (basically copying the manylinux demo script). I tried to use python.org Python, but could not figure out how to install the .pkg file to a local directory; it was not clear that it would be happy with all of them installed side by side, and I did not want to mess up my wife's laptop, which I was using for this. (I actually ran this against my fork and branch to test.)
```bash
#!/bin/bash
set -e -x
TAG=0.8.2

# Create one conda env per Python version we want a wheel for
mkdir snakes
conda create -y -p snakes/py27 python=2.7
conda create -y -p snakes/py35 python=3.5
conda create -y -p snakes/py36 python=3.6
conda create -y -p snakes/py37 python=3.7
conda create -y -p snakes/py38 python=3.8

pushd snakes/
git clone https://github.com/pytries/datrie
pushd datrie/
git checkout $TAG
git submodule init
git submodule update
popd

## this is a bit questionable (pip installing into the root conda env!)
pip install delocate

# Compile wheels
for PYBIN in ./py*/bin; do
    "${PYBIN}/pip" install -r datrie/dev-requirements.txt
    "${PYBIN}/pip" wheel datrie/ -w wheelhouse/
done

# Vendor any non-system libraries into the wheels
delocate-wheel wheelhouse/*.whl

# Install packages from the wheelhouse and test
for PYBIN in ./py*/bin/; do
    "${PYBIN}/pip" install datrie --no-index -f wheelhouse
    "${PYBIN}/python" -m pytest datrie
done
popd
```
```bash
docker run --rm -e PLAT=manylinux1_i686 -v `pwd`:/io quay.io/pypa/manylinux1_i686 linux32 /io/travis/build-wheels.sh
docker run --rm -e PLAT=manylinux1_x86_64 -v `pwd`:/io quay.io/pypa/manylinux1_x86_64 /io/travis/build-wheels.sh
```

will build all of the linux wheels.
I noticed one oversight after I merged #80: we were not excluding the Cython build product via MANIFEST.in. I added that directly on master for expediency.
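For reference, a MANIFEST.in exclusion of that kind looks like the following; the path here is illustrative, not the actual entry that was committed:

```
# MANIFEST.in: keep the Cython-generated C file out of the sdist
# (path is illustrative)
exclude src/datrie.c
```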
I'm going to close this issue, as 0.8.2 with wheels for mac and linux has been pushed to pypi :tada:
I will upload the windows wheels when they show up at https://www.lfd.uci.edu/~gohlke/pythonlibs/#datrie
Normally I don't like to upload to pypi until we have all of the wheels built, but as there are currently no windows wheels, this does not make things any worse than they are now.
The remaining steps to the release process were:

```bash
twine upload -s dist/*.tar.gz
twine upload dist/*whl
```
I have since uploaded the wheels from https://www.lfd.uci.edu/~gohlke/pythonlibs/#datrie to pypi.
This is my plan for roughly the next day to get a py38-compatible version of datrie up on pypi.