Sorry, I got demotivated and have been too busy to revisit this issue. Here are my two cents:
We can have the setup.py outside of the root, but this complicates things; among other problems, pip install git+https://github.com/Z3Prover/z3 would no longer work. This is annoying -- you won't be able to install the "cutting edge" git version very easily.

Personally, I would prefer option 3 -- have setup.py in the root directory. It's the standard way to go, and it's analogous to having the debian directory for Debian packages.
I could do either, once I get some time (probably in about 2.5 weeks).
Forgot to address Windows in the previous comment :-)
I don't have a Windows build environment. My understanding is that, with a properly configured Windows environment, it should just be a case of executing nmake instead of make. I don't have any plans to be exposed to a Windows development environment, though, so someone else would have to do that :-)
The "cutting edge" version is also quite "unstable". I would rather have people install one of our (binary) releases than confuse newbies with a bunch of error messages that don't make sense to them.
We have a separate repository for binaries anyway. What if we put the binary-install scripts there (at the root, if so desired) and then have a separate setup.py for the "cutting edge custom compiled" version, with a long and complicated URL that we publish on a new INSTALL page (wiki or text file in the root of the source), so that people can simply cut & paste the link?
Whatever you do, you should publish the source on PyPI. The source should be generated by setup.py sdist and needs to have the setup.py at its root (and be installable via setup.py install).
That may be true for a pure Python project like the Z3 Python API, but that's only a small fraction of Z3. For instance, some users will want to use two or more APIs and perhaps the binary as well, and we definitely don't want to publish packages for the cross product of all configurations.
llvmlite is an example of another API around native/binary code; its documentation recommends "Under a recent Ubuntu or Debian system, you may install the ...system package... if available." (and similar for other OSs). Are there any popular examples of this type of PyPI package? I also found pySMT, which likewise relies on system installations of solvers and their APIs.
In my opinion, the best place for setup.py would be in src/api/python, and it would install only the API bindings, but not libz3.dll/.so/.dylib/.exe; we would rely on a system-wide installation of Z3. We have the same issue with the OCaml API, see issue #34. Whatever we do, we should at the very least make sure that we follow the same strategy for all APIs. It's fine if we print "you now need to install the system Z3 package" at the end of the installation if we can't find libz3.dll/.so/.dylib in the path.
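An install-time check of that kind could be sketched as follows. This is an assumption-laden sketch: ctypes.util.find_library only consults the standard loader search paths, and the message text is illustrative, not anything Z3 actually prints:

```python
import ctypes.util

def find_libz3():
    """Return a name/path for a system-wide libz3 if the loader can see one, else None."""
    # find_library searches the platform's standard library locations
    # (ld.so cache on Linux, PATH on Windows, dyld paths on macOS);
    # it does not load the library.
    return ctypes.util.find_library("z3")

lib = find_libz3()
if lib is None:
    print("you now need to install the system Z3 package")
else:
    print("found libz3:", lib)
```

A setup.py could run this at the end of installation and emit the hint only in the None case.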
You can either make it so that the Python library works standalone or make it so that it dynamically links against the library, which would effectively be a separate package (in terms of package managers). llvmlite is (I think) an example of the former. There are tons of examples of the latter (take any wrapper around a C library; h5py is the first one that pops into my head).
I personally prefer the latter, especially for something like Z3 that has multiple language bindings, because it makes clear the distinction between the C library and the bindings which link against that library. The advantage of the former approach is that you can make standalone packages for different languages (e.g., if you look at llvmlite, it is just the Python package; there are no files that get installed in lib/, everything is in lib/python3.5/site-packages/llvmlite). For instance, with the latter approach, if someone wanted to install z3 from the sdist source from PyPI they would first need to install z3.dylib (or you can put some hacks in setup.py to try to do that automatically).
Thanks for the pointers @asmeurer! I like that too; we should just make the API a separate project (at least conceptually, but we could even put it into its own repository).
On my Windows Python 3.5 (with VS2015 in the path), h5py also provides no hints; it just says: ... h5py\api_compat.h(27): fatal error C1083: Cannot open include file: 'hdf5.h': No such file or directory. llvmlite complains that cmake isn't installed, likewise with no checks or hints: ... [WinError 2] The system cannot find the file specified
Browsing around a little I found that wheels may be the future of binary distributions in PyPI, at least on Windows and OSX. Maybe we should look into that too?
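If wheels are the route taken, the build step itself is short. A command sketch, assuming a working setup.py in the source root (the output filename pattern is the standard wheel naming convention, shown for illustration):

```shell
pip install wheel                # provides the bdist_wheel command
python setup.py bdist_wheel      # produces dist/<name>-<version>-<tags>.whl
# a platform-tagged wheel can then be uploaded to PyPI and installed
# with plain pip, with no compiler needed on the user's machine
```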
Chiming in: angr-z3 has the necessary changes to properly install Z3 on OSX for Python 2.7.11 and Python 3.5.1. It is also pip-installable on Linux. Some Windows support is there already and it should work out of the box, however it is untested (I also do not have a Windows box to test).
One of the core reasons why I prefer that a Python package installs its own library: I can take a virtualenv, freeze it, install it on a different machine, and know that it works. This is a common task and makes reproducibility trivial. However, using a system-wide Z3 forces me to make sure that the Z3 Python API I'm installing can actually use the library that I have installed system-wide, and that there is no feature I am calling from Python code that the (older) system-wide library does not support (in fact, Ubuntu and Debian both only provide 4.4.0, which contains a "multitude of bugs"). In addition, I now have to install system-wide packages on top of the Python ones, or I have to manually check the library version and install it into the virtualenv. The former is something that -- as a normal user -- I might not be allowed to do. The latter is ugly and cumbersome. Hence, I strongly advocate that the Python API installs its own version of the library (which would only be used in the Python virtualenv).
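The freeze-and-reproduce workflow described above, as a command sketch (the package name z3-solver is assumed here, and the point only holds if the package bundles its own libz3):

```shell
python -m venv env && . env/bin/activate
pip install z3-solver             # hypothetically bundles its own libz3
pip freeze > requirements.txt     # pin exact versions for reproducibility
# ...later, on another machine:
pip install -r requirements.txt   # restores the same set of packages
```

With a system-wide Z3 the last step is no longer sufficient: the target machine additionally needs a compatible libz3 installed by other means.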
Overall, I would personally go with option 3, as it is the intended and practically the only way supported by pip, and because it has the majority of benefits over the other "hacky" solutions we are discussing (see zardus' and asmeurer's arguments for an sdist setup.py in the root).
Out of curiosity, and without wanting to step on any toes: Is the plan to support almost all package managers for all languages that have an API, or just the main one? What other (main) package managers require files in the root directory? How many files would end up in the root directory at most? Is this maybe less of a problem than we think it is? What would prevent the setup.py from being moved into a subdirectory at a later point (if the root really becomes cluttered, or when better support for this is available in pip), while making the project easily pip-installable now? Assuming that installs on Windows work, wouldn't this solve the entire issue?
Our (the core team) plan is not to support any packages, but we strongly encourage others to maintain packages for their favourite package systems.
The issue of system vs. language package, sometimes made even worse by having many systems (brew, macports on OSX) and many languages (all Z3 APIs), is always there. In this particular case, I can very well imagine that we have both packages: a z3py package with everything and a z3py-api package with only the API. Personally, I would just check whether z3/libz3 can be found during installation time, and if they are not found, ask the user whether we should install another binary z3-bin package.
Having both a system z3 and a language z3 is a pain from the maintenance point of view, especially when they are different versions, and even worse if the language z3 is custom compiled on the users' machines with whatever broken settings they have in their setup.
@cao in general pip/setup.py can't install things to lib (might not have permissions), so the only way to do that is to have a separate build of the library that sits alongside the Python package. I imagine a potential issue there is DLL hell, where one version of the library is installed in lib and another version is installed alongside the Python package.
This really is a limitation of pip (it can't install non-Python packages). Other package managers like brew, apt, conda, etc. can and for them it doesn't make sense to bundle. This is no different from trying to support other Python libraries that are API wrappers with pip/virtualenv (again, I'll give h5py as a simple example).
@asmeurer I'm talking about the library that sits alongside the Python package in the virtualenv or in the $HOME/.local/lib/python<version>/ directory (which is exactly what angr-z3 is doing).
How exactly would DLL hell apply here? Only the Python package will load the library that was installed alongside it since it would not be in the default library path. Of course you will have two different versions of Z3 installed (at very different places), but where is the problem in that? You might want to do that anyways for compatibility with older code, to not break code depending on the system and distribution supported version (which might happen if you supply a deb/rpm package for libz3), and one of them is exclusively used by Python (possibly in a virtualenv).
If you are linking against the z3 library from a different place, then you should be aware of the version and location anyway (and for other APIs this does not apply), so DLL hell does not apply either. I fail to see any downsides to this approach.
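The loading scheme being argued for can be sketched as follows: the bindings look for a private libz3 next to the package itself rather than on the system search path, so only the Python package ever loads that copy. Every name and the on-disk layout here are assumptions for illustration, not Z3's actual loader:

```python
import ctypes
import os
import sys

def bundled_libz3_path():
    """Path where a libz3 shipped alongside this package would live (hypothetical layout)."""
    if sys.platform.startswith("win"):
        name = "libz3.dll"
    elif sys.platform == "darwin":
        name = "libz3.dylib"
    else:
        name = "libz3.so"
    # Resolve next to this module, not via the system library path,
    # so a system-wide libz3 of a different version is never picked up.
    return os.path.join(os.path.dirname(os.path.abspath(__file__)), name)

def load_libz3():
    """Load the private copy if present; fail loudly rather than fall back silently."""
    path = bundled_libz3_path()
    if os.path.exists(path):
        return ctypes.CDLL(path)
    raise OSError("bundled libz3 not found at " + path)
```

Because the path is anchored to the package directory, two Z3 versions (system-wide and virtualenv-private) can coexist without interfering.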
capstone is an example where the library is shipped/built with the Python bindings. It makes installing it very easy, and makes using it in projects much more reasonable.
For my usecase (installing https://github.com/angr/angr), it's pretty critical that pip can install z3, including the library. My guess is that other systems that rely on z3 are going to have relatively similar requirements.
The two-package solution (one package for z3-bin and one for z3py) would also work for us, as we can make angr depend on both packages, as would an install-time autodetection (and, if it is not detected, building) of libz3, but, IMO, dealing with installation outside of pip should be unnecessary if one wants to use z3 just from Python.
Any news?
I take it you are asking for a pip install story? The angr z3-solver pip distribution is from July and is at 4.8.0. I don't maintain it and in fact haven't created pip packages before.
Perhaps a friendly user can help scripting a setup to create the necessary files for a pip distribution. The scripts could be either part of cmake build options or a stand-alone script in the style we use for creating nuget packages. I would certainly take such a contribution and if pushing it to a service that distributes pip packages is easy enough (for nuget distribution we are dealing with package signing barriers) it could be a sustainable process.
If z3-solver works I guess it's fine, it would be nice if you recommended it in the readme?
It's because I assumed I need to install z3 pip package which turns out to be something entirely different.
right, the car z3, the computer z3, experia z3, and the zfs on s3 are all different.
z3-solver is on pip
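For reference, installation and a quick smoke test look like this, assuming network access and that z3-solver remains the maintained distribution (a command sketch, not an official recommendation from the Z3 team):

```shell
# Note: the PyPI package named "z3" is unrelated; the bindings are "z3-solver".
pip install z3-solver
python -c "import z3; print(z3.get_version_string())"
```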
Restarting the discussion about packages for the Python Z3 API. A previous PR was automatically closed when the unstable branch was removed. Users @zardus, @asmeurer, @zachriggle, @musically-ut, @cao were involved in the discussion.
I understand that the previously crafted scripts were able to install the Z3 package for at least some Linux Python installations, but there were problems with other platforms, at least with Windows.
There was also the question of where to put the packaging scripts; I don't want them in the project root because then we'll have to follow suit for all other package managers and it seems that other projects do indeed store them elsewhere (e.g., Pip install fails to install dependency when setup.py not in root directory).
In the meantime, all existing Python scripts have been updated to work with both Python 2.x and 3.x, so I hope that should simplify some of the remaining tasks.
Does anybody know a Python expert who could help us out with setting up the scripts/packages?