In my opinion, the only acceptable way of vendoring pluggy would be an automated variant applied at distribution packaging time that can be undone very easily and in a controlled manner.
I have some plans to render general vendoring unnecessary, but those have long-term timeframes, so we need to do it for pytest in any case.
I think the reason is that pluggy is very immature at this point: it is at 0.3.0 now, but 0.4.0 might be backwards incompatible and would break all pytest installations out there, and it isn't desirable to cut a new pytest release just to keep up with a new pluggy version. One might pin pluggy to a specific version in pytest, but I think it is desirable to be able to create new features (possibly backward-incompatible ones) when working on another project which uses pluggy (devpi, for example), and that would pose a problem for users who use both projects in the same environment, as both would have to be pinned to incompatible versions.
> but I think it is desirable to be able to create new features (possibly backward-incompatible ones) when working on another project which uses pluggy (devpi, for example), and that would pose a problem for users who use both projects in the same environment, as both would have to be pinned to incompatible versions.
This is a good point, I didn't think of that. Okay, I agree, makes sense.
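A minimal sketch of the pinning conflict described above; the version ranges are purely illustrative and not anything either project actually declares:

```python
# Hypothetical setup.py fragment showing why hard pins clash.
# If pytest pinned pluggy like this (illustrative range only) ...
from setuptools import setup

setup(
    name="pytest",
    install_requires=["pluggy>=0.3.0,<0.4"],
)

# ... and another pluggy user (e.g. devpi) pinned a newer, incompatible range:
#
#     install_requires=["pluggy>=0.4.0,<0.5"]
#
# then no single pluggy version satisfies both requirements, so the two
# projects could no longer be installed into the same environment.
```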
I'm certain Debian would fuck everything up and unbundle pluggy. This happened with python-requests and urllib3, and because requests exposed some of urllib3's API as its own, the results were not pretty.
> I'm certain Debian would fuck everything up and unbundle pluggy.
What do you mean? They would literally just change pytest's code to use the system's pluggy installation?
Yes.
https://github.com/kennethreitz/requests/pull/2567
Oh my. :worried:
Please don't bundle pluggy. Fedora packaging guidelines forbid bundling in general (there are reasons for that). Exceptions are possible in principle, but they need to be justified, and the process puts extra burden on packagers.
Distributions have a history of breaking pip all the time by de-bundling.
The bundling is done specifically to ensure pip keeps functioning in the face of different installed versions of that library, since Python cannot load multiple versions of the same library in one environment.
@thmo why is "ensuring function in the face of different unsupported versions of that library" not a sufficient reason?
Also, history shows that distributions pretty much fail to do the de-bundling properly on a regular basis.
Because security is valued more than making sure that anything "functions in the face of different unsupported versions", at least in the case of Fedora. The latter is a manual step, e.g. in the %check phase in the case of RPMs. As soon as a security flaw is found, there is only one place to change, which helps get the update out as fast as possible. With bundling it can become very ugly to track down all the bundled copies of a flaw in a reasonable time.
Nevertheless, there are also good reasons in favor of bundling, such as making sure that the right version of the dependency is available. For ipython, we found a compromise: bundle in such a way that unbundling is easy and is just a matter of `rm -rv` on the right subfolder/files.
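To make the "easy to unbundle" idea concrete, here is a minimal sketch of such a scheme; the module paths are assumptions for illustration, not necessarily pytest's actual layout:

```python
# Sketch of an rm-friendly vendoring shim (hypothetical module paths).
# The bundled copy lives in a single directory, e.g. _pytest/vendored_packages/,
# so a distribution can unbundle with `rm -r _pytest/vendored_packages`
# and this shim transparently falls back to the system-wide pluggy.
try:
    # Prefer the copy shipped inside the pytest tree.
    from _pytest.vendored_packages import pluggy
except ImportError:
    # The vendored directory was removed (e.g. by a distro package),
    # so use the pluggy installed in site-packages instead.
    import pluggy
```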
Please don't bundle pluggy. And if you do, please make it easy to unbundle, so that we can break with that history and all distributions can do the unbundling properly.
We picked an rm-based scheme. I expect distros to horribly break things, just like they do on a regular basis for pip and requests.
I see. It seems that pip, at least, is not yet unbundled, for various reasons.
Sounds like you anticipate certain breakages; maybe they could be covered by a TESTENV=unbundled test environment? (At least it would then be run by the packager when importing the package into the distro.)
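As an illustration of what such an unbundled environment could verify, here is a hypothetical test; the path layout it checks is an assumption, not pytest's real structure:

```python
# Hypothetical check for an "unbundled" test environment: after a distro
# removes the vendored copy, pluggy should be imported from site-packages,
# not from a directory inside the pytest package tree (path assumed here).
import os

import pluggy


def test_pluggy_is_not_vendored():
    vendored_fragment = os.sep + "_pytest" + os.sep
    assert vendored_fragment not in pluggy.__file__
```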
That's up for later. There are some ideas to solve bundling/de-bundling properly within PyPA, but that needs changes to setuptools and pip, and distro policies for de-bundled multi-versioned libraries.
Basically, just de-bundling things without having a multi-version import system in place is naive tinkering on the side of the distros; it's impossible not to expect strange failures from such behavior, because the bundling was specifically done to avoid certain issues that are NOT solved, and the distros certainly don't seem to want to solve them properly.
As long as the distros do half-assed de-bundling, breakages are simply to be expected as the norm. We are lucky Red Hat is not quite as bad as Debian there (remember the utter failure w.r.t. weak SSL keys due to downstream tinkering).
I am very strongly opposed to distro tinkering, because very often it is done without involving upstream or a deep understanding of the reasons, just because there are rules. However, upstream still gets all the backlash for packages broken on distros; this is the technical equivalent of an unwitting sociopath troll pulling pranks on people and putting the blame on others.
For example, the pip developers by now tell every user with pip issues on a distro to always uninstall the broken distro packages and use upstream instead.
If distros want de-bundled packages, they should provide that in a reliable way to upstream, not as a tinkered mess that exists only downstream.
I was reviewing this ticket in light of #2641.
> that would pose a problem for users who use both projects in the same environment
I would like to highlight that Setuptools did once address this issue. It was called multi-version installs. Pip explicitly and purposefully dropped support for multi-version installs because it declared that no-one was ever going to need that feature. And yet, we find ourselves now bypassing the packaging infrastructure and bundling packages for just this purpose.
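For readers unfamiliar with that feature, here is a rough sketch of how setuptools multi-version installs were used; the version range is purely illustrative:

```python
# Sketch of setuptools' multi-version install mechanism. Packages installed
# as "multi-version" (e.g. via `easy_install -m`) are not importable by
# default; each application asks pkg_resources for the version it needs
# before importing it.
import pkg_resources

# Put a suitable pluggy distribution on sys.path (illustrative range only).
pkg_resources.require("pluggy>=0.3.0,<0.4")

import pluggy  # resolves to the version activated above
```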
In this mail, @nicoddemus said:
> I've thought about this, and I wonder what the rationale is - I actually see some reasons why it should not be vendored:
cc @hpk42