warsaw opened 4 years ago
https://github.com/indygreg/PyOxidizer/pull/199 should solve the problem
@jayvdb That does help -- I can get farther. But I still have insurmountable, mysterious, and undebuggable packaging failures. For example, one of my dependencies can't be found in our internal PyPI mirror even though I can confirm it's there, and I can even explicitly install the wheel into the `venv_path` (as given in #199).

For whatever reason, pip still can't resolve that dependency, and given PyOxidizer wants to "clean things up" on failure, it's nearly impossible to investigate what's going on.
PyOxidizer's hacking of distutils and its insistence on not using binary wheels creates packaging incompatibilities like this. I plan to loosen the restriction around how 3rd party Python packages are consumed, allowing PyOxidizer to work with more traditional Python distributions (e.g. a standalone Python shared library) so that pre-built 3rd party packages just work. But this will necessitate moving away from single file executables. I want PyOxidizer to be flexible enough that any Python application can use it. So far, most of my time has been spent on supporting single file executables because that was an unsolved problem that I wanted to solve.
@warsaw out of curiosity, for this use case do you care more about single file executables or about using PyOxidizer as a Python packaging tool? What exactly do you want PyOxidizer to do? E.g. would it be acceptable if it put all the `.py`/`.pyc` files into a single file but still had standalone `.so`/`.dll` files for extension modules?
Also worth noting that on Windows, PyOxidizer could leverage traditional Python linking strategies and still achieve single file executables, since it is possible to import a DLL from a memory address on Windows. I suspect we'll eventually implement this, as recompiling various extensions on Windows is a PITA and the path of least resistance will be to extract Python resources from binary wheels and stuff them all in a PyOxidizer-managed binary.
@indygreg I kind of care about both :)
At work we definitely want single file executables. The tricky bit is that I have to somehow integrate the distutils hacks we do with the ones PyOxidizer does, so that both functionalities work.
OTOH, for more personal projects, just making it work would be good enough, thus having the loader unpack the shared libraries on demand (rather than the pex/shiv way of unpacking the whole zip on first execution).

Oh, and we are on Linux and macOS, so yeah, we're limited to `dlopen()` on those platforms. :(
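For concreteness, a minimal sketch of that on-demand approach: extract a single bundled `.so` to disk only when it is first needed, then let the normal extension loader `dlopen()` it. The `EMBEDDED_SOS` table and the cache path are purely illustrative, not part of any existing tool:

```python
import importlib.machinery
import importlib.util
import os
import tempfile

# Illustrative: mapping of extension module name -> raw .so bytes,
# populated by whatever packs the single-file executable.
EMBEDDED_SOS = {}

def load_embedded_extension(name):
    """Extract one .so on first use, then import it (dlopen under the hood)."""
    cache_dir = os.path.join(tempfile.gettempdir(), "myapp-extensions")
    os.makedirs(cache_dir, exist_ok=True)
    path = os.path.join(cache_dir, name + ".so")
    if not os.path.exists(path):
        with open(path, "wb") as f:
            f.write(EMBEDDED_SOS[name])
    loader = importlib.machinery.ExtensionFileLoader(name, path)
    spec = importlib.util.spec_from_file_location(name, path, loader=loader)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module
```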
I think using a PEP 517/518 `pyproject.toml` should solve this:

```toml
[build-system]
requires = ["setuptools", "wheel", "distgradle"]
build-backend = "setuptools.build_meta"
```
In my experiments with PyOxidizer so far, if you put that in your `pyproject.toml` and this into your `pyoxidizer.bzl`'s `make_exe`:

```python
exe.add_in_memory_python_resources(dist.pip_install([CWD]))
```

then pip will notice `pyproject.toml`, `pip install setuptools wheel distgradle`, and proceed to run your `setup.py` build script like normal.
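For context, a minimal `pyoxidizer.bzl` sketch around that line. This assumes the Starlark API of the time (`default_python_distribution`, `to_python_executable`); exact names vary between PyOxidizer releases:

```python
# pyoxidizer.bzl -- a sketch, not a canonical config.
def make_exe():
    dist = default_python_distribution()
    exe = dist.to_python_executable(name="myapp")
    # Run pip against the current directory; with a pyproject.toml
    # present, pip installs the declared build requirements first.
    exe.add_in_memory_python_resources(dist.pip_install([CWD]))
    return exe

register_target("exe", make_exe, default=True)
resolve_targets()
```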
FWIW, I think this is a much safer solution than installing an entire virtual environment. (That said, I'm still having trouble getting around enthought/comtypes#199 with that technique. I may open a separate issue to address that.)
This is admittedly a non-standard use case, but it breaks PyOxidizer on our internal libraries. For integration purposes we have something like the following at the top of our internal library `setup.py` files:
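A minimal sketch of what such a `setup.py` preamble looks like; the `distgradle` import and its `get_version` helper are purely illustrative, since the real API is internal:

```python
# setup.py -- illustrative; the real distgradle API is internal.
from setuptools import find_packages, setup

# The crux: this import only succeeds if distgradle is already
# installed in the environment that executes setup.py.
import distgradle  # hypothetical internal integration package

setup(
    name="ourlibrary",
    version=distgradle.get_version(),  # hypothetical helper
    packages=find_packages(),
)
```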
This works for us because our build system guarantees that `distgradle` will be installed in the venv before this `setup.py` is executed. However, this breaks PyOxidizer because even though I can add a `PipInstallSimple` rule to install `distgradle` before the `PipRequirementsFile` rule that pulls in all of our internal dependencies, IIUC PyOxidizer first builds dependencies in a temp venv because it might have to manipulate the built packages for static linking and such. Thus `PipInstallSimple(package='/path/to/distgradle.whl')` will put `distgradle` in the target venv, but it won't be available in the temp venv for our internal library build. For the same reason, adding `distgradle` to my `requirements.txt` also doesn't work.

While more rare, this can be a problem with PyPI packages too, if they also have these kinds of implicit pre-install dependencies. I remember seeing such packages, but don't have an example handy. Another possible problem would be packages which build differently depending on what other packages are already installed (e.g. conditional dependencies, etc.).
I was thinking about maybe adding a feature for "pre-installs" which would land in the temp venv before the rule would run. While ugly, it probably wouldn't be too terrible.
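To make the idea concrete, here is a hypothetical `pyoxidizer.bzl` shape for such a feature. The `pre_install` argument does not exist in PyOxidizer; it is purely a sketch of the proposal:

```python
# Hypothetical syntax -- nothing like pre_install exists today.
exe.add_in_memory_python_resources(
    dist.pip_install(
        ["-r", "requirements.txt"],
        # Wheels to install into the temp build venv *before* the
        # requirements above are resolved and built.
        pre_install=["/path/to/distgradle.whl"],
    )
)
```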
Another option would be to clone the target venv, or merge the target venv with the temp venv before you try to build the individual dependency. Merging and cloning venvs though is fraught with peril.
I'm open to other suggestions, and I'm motivated to help implement a solution (if it's not too complex).