callahad closed this issue 2 years ago
"did we ever ship a release with a vulnerable version of that dependency?"
this appears to assume that our releases include their dependencies. We do not vendor our dependencies into the pypi release, with good reason.
I think what you mean is "did we ever ship docker images for a release with a vulnerable version of that dependency?".
It's a bit of a terminology nit, but an important one imho: Synapse releases are distinct from those of our dependencies.
I think what you mean is "did we ever ship docker images for a release with a vulnerable version of that dependency?".
Sure, Docker images or Debian packages. Our built release artifacts / things we're responsible for as a team. Downstream packagers and people directly consuming a Synapse wheel or sdist are outside of the scope of this proposal.
For any given Synapse commit, it is possible to repeatably create identical virtualenvs.
From previous conversations my understanding was that it was important to be able to create an identical virtualenv for any release of Synapse, not any commit. (I think this is related to what @richvdh is saying though.) Those aren't quite the same as described though.
This is fine, and as you note has many advantages, but I do have a counter-requirement, which is:
We should not fall into the trap of only supporting the pinned version of a dependency. To take an arbitrary example, we currently require `cryptography>=3.4.7`. That's eight months old now, and, being a crypto library, if we were pinning we would probably pin to the most recent version. However, doing so would make packaging Synapse for distributions effectively impossible.
I therefore think that any "pinned" versions should be in addition to the current "minimum" versions, where the minimum versions are also subject to a certain amount of testing (as we have presently).
(Exactly how you would choose whether to use the "pinned" version or the "minimum" version is very much up for debate.)
my understanding was that it was important to be able to create an identical virtualenv for any release of Synapse, not any commit.
@clokep: Is this a distinction without a difference? Even if we were only dumping `pip freeze` into the first commit of a release branch, we'd end up with that frozen `requirements.txt` file back on `develop` once the release branch merged back in. This would nominally satisfy the requirement of being able to create identical virtualenvs for any commit from that point on.
Functionally, I suspect we'll prefer to actively maintain the lockfile with the rest of our source during development, rather than only touching it when making release branches, but we'll see how things shake out.
We should not fall into the trap of only supporting the pinned version of a dependency. [...] I therefore think that any "pinned" versions should be in addition to the current "minimum" versions, where the minimum versions are also subject to a certain amount of testing (as we have presently).
@richvdh Happy to go along with that here. The standard practice, across languages, is to separate declared minimums from locked versions (`setup.py` vs `requirements.txt`, `package.json` vs `package-lock.json`, `Cargo.toml` vs `Cargo.lock`, `Gemfile` vs `Gemfile.lock`, etc.), so we'll be able to do that.
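As a sketch of what that split looks like in poetry's terms (the file contents and version numbers here are illustrative, not our actual pins):

```toml
# pyproject.toml -- the declared minimum, which downstream packagers see:
[tool.poetry.dependencies]
cryptography = ">=3.4.7"

# poetry.lock -- the exact pin that our own artifacts (Docker, Debian) install:
[[package]]
name = "cryptography"
version = "36.0.1"
```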
The question of what our minimums should be (or when to raise them) is a separate question, and one we already face regardless of this proposal.
my understanding was that it was important to be able to create an identical virtualenv for any release of Synapse, not any commit.
@clokep: Is this a distinction without a difference? Even if we were only dumping `pip freeze` into the first commit of a release branch, we'd end up with that frozen `requirements.txt` file back on `develop` once the release branch merged back in. This would nominally satisfy the requirement of being able to create identical virtualenvs for any commit from that point on.
It is different, yes. There's an assumption being made that `develop` should have this on it. I'm unsure if I agree with that.
Functionally, I suspect we'll prefer to actively maintain the lockfile with the rest of our source during development, rather than only touching it when making release branches, but we'll see how things shake out.
Perhaps, but that doesn't seem to be one of the requirements.
Perhaps, but that doesn't seem to be one of the requirements.
Let's take the requirement as stated: any commit, not just releases, must have enough information (a lockfile) to reliably recreate identical virtualenvs.
This is a slight leap, but one I believe is inevitable as maintaining a lockfile on the main branch is a prerequisite for availing of Dependabot's features, which are genuinely compelling.
Enabling repeatable builds for arbitrary commits is also necessary for continuous delivery, but that's a distinct concern best left for the future.
There's general consensus that we're fine going ahead with this. The question is how. Next month we'll pick a victim volunteer to research Poetry, Pipenv, and pip-tools and make a recommendation.
We'd likely benefit from proactively simplifying `setup.py` and `synapse/python_dependencies.py` in advance, as Poetry replaces setuptools entirely, while pip-tools and Pipenv overlap with it.
(Braindump) A few questions to ask of a given solution:

- How does it handle environment markers, e.g. `psycopg2>=2.8;platform_python_implementation!='PyPy'`?
- How does it handle multiple dependency groups: `lint`, `dev`, etc?

Also worth noting that PyCharm supports both Pipenv and Poetry.
Other writings on the topic of Pipenv vs Poetry:
The pipenv lock command does not capture dependencies of Python wheels provided in a Pipfile. This can result in non-deterministic builds in which each run may not pull in the same dependencies, causing inconsistencies between runtime environments.
While reading around I also came across pyflow, micropipenv and pip-tools.
Frankly I feel paralysed by too much choice. I'll try to see if I can summarise this into something digestible.
Frankly I feel paralysed by too much choice. I'll try to see if I can summarise this into something digestible.
If integrating with GitHub's Dependabot security stuff is a hard requirement (and it might as well be), our only choices are Poetry, Pipenv, and pip-tools. I'd recommend stepping back from others' opinions and blog posts and just diving in on your own:
Don't worry about CI or anything... just get a feel for how each tool works, how its docs are, etc.
After, go and try to answer the questions from this comment for each.
Right, I've stared at a lot of the above and related web pages. I concentrated on pip-tools, pipenv and poetry because they were supported by dependabot. There's an info dump coming up, but TL;DR:

- Going by documentation and `--help` alone, I have a slight preference for pip-tools.
- Agree it needs to be put into practice.
| Requirement | pip-tools | pipenv | poetry |
|---|---|---|---|
| new venv from lock | `pip-sync requirements.txt [...]` | `pipenv install` | `poetry install` |
| checksumming | hashes checked by pip | automatic, but not sure of details | automatic I think |
| add dependency | add to `requirements.in`; `pip-compile` | `pipenv install PACKAGE~=VER` | `poetry add PACKAGE` |
| edit dependency bound | edit `requirements.in`; `pip-compile` | edit Pipfile and `pipenv lock`, or `pipenv install` | edit `pyproject.toml`, `poetry lock`; or `poetry add` |
| get newest package ver | `pip-compile -P PACKAGE` | `pipenv update PACKAGE` | `poetry update PACKAGE` |
| remove dependency | remove from `requirements.in`; `pip-compile` | `pipenv uninstall PACKAGE` | `poetry remove PACKAGE` |
| lock with checksum | `pip-compile --generate-hashes` | `pipenv lock` | `poetry lock` |
| editable installations | unsure; `-e .` in `requirements.in`, but bugs? | `pipenv install -e .` | editable by default |
| minimum ver while pinning | ✓ | ✓ | ✓ |
| environment markers | run once per environment, multiple `requirements.txt` | own syntax in Pipfile; unsure how it's treated in lock | supported, dunno about lock |
| multiple dep categories | `pip-compile --extra GROUPNAME` reads setup.py; multiple infiles? | just `--dev` and `--dev-only` in various cmds | supported, dunno about lock |
| detect newer versions | `pip-compile --dry-run` and diff? | `pipenv update --outdated` | `poetry show --outdated` |
| detect security problems | none, pipe to safety | `pipenv check`; uses Pyup safety | none, pipe to safety |
| Dependabot support | 6.1.0 | <= 2021-05-29 | v1 |
| explain given dependency | `pip-compile --annotate` | `pipenv graph` | `poetry show --tree` |
| PyCharm support | none, just requirements.txt | yes | yes |
| override bad dep metadata | unsure; what does pip allow? | unsure | no; called out in HN comment |
| dry run | `pip-compile -n`; `pip-sync -n` | `pipenv update --dry-run` | `--dry-run` everywhere |
| conflict resolution | defers to `pip` | unsure | home-grown, claimed as better |
| venv access | normal venv? | `pipenv run` or `pipenv shell` | `poetry run` or `poetry shell` |
| other | can pin `distribute`, `pip`, `setuptools`; "Ideally, you should only have one target Python version." | historical drama, inactivity; unsure how to expose dependencies on PyPI; custom Pipfile syntax? | `build` and `publish` shortcuts; trying to be all-in-one?; seems trendier/more active/more favoured?; manages multiple venvs? |
Ultimately I think we could choose any one of these and there's not much between them for our purposes. On the face of it, I think I'd recommend pip-tools > poetry > pipenv. Rationale:

Pip-tools pros:

Pip-tools cons:

- For the `dev`, `test` and `lint` environments, I think we'll need to have separate input and output requirements files which refer to each other.

Poetry pros:

- I preferred poetry's `--help` to pipenv's.

Poetry cons:
does poetry make setup.py redundant? does it aim to?
Yes to both; in addition to managing environments, Poetry is a complete replacement for Setuptools. This includes related build + publishing tools (twine, build, etc.).
Pipenv and pip-tools are concerned only with installing dependencies; they want nothing to do with packaging. (Pipenv also manages venvs for you, pip-tools expects you to manage your own venvs).
Detect newer versions [...] poetry update --dry-run
I'll also note that `poetry show --outdated` does an excellent job of summarizing the state of the world:
The red vs yellow coloring is based on whether the upgrade would be "safe" according to semver (e.g., the leftmost part of the version didn't change). This makes it much easier to keep up to date if manually updating things: the versions in red are likely to be the quickest / easiest to knock out, while the yellow ones might have backwards compatibility concerns and thus take more work.
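That colouring rule can be sketched as a tiny predicate (a hypothetical illustration of the heuristic described above, not poetry's actual implementation):

```python
def upgrade_risk(current: str, candidate: str) -> str:
    """Classify an upgrade as 'safe' when the leftmost (major) version
    component is unchanged, per the rough semver heuristic above."""
    return "safe" if current.split(".")[0] == candidate.split(".")[0] else "breaking"
```

So a 1.x to 1.y bump counts as "safe", while a 1.x to 2.0 bump counts as "breaking".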
https://github.com/DMRobertson/test-packaging has a branch for each tool under consideration.
Right, here's wot I think.
Of the three tools supported by dependabot:
- Their lock files can be exported to `requirements.txt` format, so in the worst-case scenario we can reuse the lock files with `pip` or `pip-tools`.

I reckon we should use poetry because its lock file is comprehensive, it manages a single source of truth, and it has pretty good documentation with a nice CLI UI. pip-tools is more awkward if you want to make use of environment markers. pipenv left me confused: its interplay with setuptools was unclear, I think partly because pipenv is intended for managing application environments, not libraries. Also, many of pipenv's commands lack a `--dry-run` option.
In more detail:
To start, I want to complain about pipenv. AFAICS it's designed for wrangling applications alone: a single set of fixed packages and versions, ideally running on a single fixed interpreter on a fixed host. If we were only supporting our debian packages and docker images, that'd be fine, but... we want to give our community packagers a reasonable range of lower bounds too.
Instead, for packaging a library (and publishing to PyPI?) you're expected to use the existing setuptools infrastructure. Seeking an example, I went to see how pipenv itself uses its Pipfile and setup.py. I was confused by what I found.

Perhaps pipenv will read from setup.py/cfg much like pip-tools can? It must do, since the packages in setup.py appear in the lockfile. If so, this isn't clear from the documentation. And while I'm at it: generally speaking I found their documentation to be more of a tutorial than a reference; I found it hard to use it to answer my questions.
Pros:

- Built-in `safety` check. Other tools have to export and pipe to `safety`.

Cons:

- Unclear how to refer to `setup.py` in a Pipfile.
- Many commands lack a `--dry-run` option.

Other:

- What do the colours in `pipenv`'s help output denote? No idea.

Pip-tools was my initial preference. I thought it would be simplest: given a virtualenv, you manually control the spec file, run one command to make the lock file, and then one command to update your virtualenv. Regrettably it took a bit of exploring to work a few things out. I had to opt in to hash generation with `--generate-hashes`. It took me a moment to realise that `pip-compile` can parse `setup.cfg` (and maybe `setup.py`)? Once I'd realised that, great---we have a single source of truth for our package requirements.
Unfortunately the lockfile only caters to a single environment at a time. That means a separate lockfile for a dev environment, a separate one for a PyPy environment, a separate one for a Python 3.8 environment with a backport exclusive to 3.8, and so on. It seemed a little messy, and the other tools can account for this.
I think the only other thing worth saying is that the CLI isn't as fancy. There isn't an easy way to query the state of the lockfile: what dependencies could be updated?
Pros:

Cons:

- Reviewing lockfile changes means reading a large `git diff`.
- Careful with `pip install --user pip-tools`. (`pip-sync` tried to nuke my system virtualenv.)

If I'm being honest, by this time I was tired of the packaging swamp that Python has created for itself. Poetry covers the same ground as pipenv, but it's more ambitious. I think poetry is trying to be `cargo` for the Python world.

One major difference: it eschews setuptools---the equivalent metadata has to go in pyproject.toml, and poetry itself will try to build the project. I had a quick look at our wheel build for synapse, and we don't seem to have any fancy C dependencies. I think this means that we're not doing anything clever in setup.py, and that using poetry instead should be fairly straightforward. (There's no reason we have to use poetry for this, but we wouldn't want to maintain parallel config for two systems.) I wasn't sure if things would get complicated if we wanted to build things in Rust and link them with PyO3, but it looks like https://github.com/pyo3/maturin#mixed-rustpython-projects might be a different tool to use for this? I'm not sure.
The killer feature for me though was that the lockfile describes a locked version of dependencies for all environments. Take for instance this one:
[[package]]
name = "black"
#...
[package.dependencies]
#...
typing-extensions = [
{version = ">=3.10.0.0", markers = "python_version < \"3.10\""},
{version = "!=3.10.0.1", markers = "python_version >= \"3.10\""},
]
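For what it's worth, marker expressions like those can be evaluated directly with the third-party `packaging` library (a sketch of the marker semantics only, not of poetry's internals; assumes `packaging` is installed):

```python
from packaging.markers import Marker

# The same marker expressions as in the lockfile excerpt above.
old_pythons = Marker('python_version < "3.10"')
new_pythons = Marker('python_version >= "3.10"')

# Evaluate against an explicit (partial) environment override.
assert old_pythons.evaluate({"python_version": "3.9"})
assert not old_pythons.evaluate({"python_version": "3.10"})
assert new_pythons.evaluate({"python_version": "3.10"})
```

An installer resolving that lockfile picks whichever `typing-extensions` entry has a marker that evaluates true for the target environment.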
The CLI/UI is nice enough but basically comparable to pipenv's. (I liked the more prevalent `--dry-run` and interactive options.) The documentation was a good enough reference, if a little barebones.

Pros:

- `--dry-run` everywhere.
- `foo^=1.2.3` syntax for soft-pinning to 1.2.3 or anything semver-compatible. That's slightly more flexible than what pip supports (`foo~=1.2.3`).

Cons:

- A very busy `--help` menu. A little overwhelming.

Other:

- No-op lock + update venvs. Same requirements + the tool itself:
  - pip-tools: `pip-compile setup.cfg --generate-hashes --extra dev,test,lint,meta -o requirements-dev.txt && pip-sync requirements-dev.txt`
  - pipenv: `pipenv update --dev`, mostly in locking
  - poetry: `poetry lock && poetry install`
I would really like the core developers to have sorted this out (c.f. cargo) so there isn't fragmentation. Live and learn, I suppose.
@DMRobertson It isn't 100% clear to me, but do any of these tools handle "extras"? It seems like for `pip-tools` it would be completely manual, but does poetry? (E.g. right now you can `pip install synapse[oidc]` -- how does this work with pinning?)
Thanks David — that was quite a good run-down.
Since I had to look it up: `safety check` seems to refer to https://pypi.org/project/safety/ — a way to list known security vulnerabilities in your dependencies.
@DMRobertson It isn't 100% clear to me, but do any of these tools handle "extras"? It seems like for `pip-tools` it would be completely manual, but does poetry? (E.g. right now you can `pip install synapse[oidc]` -- how does this work with pinning?)
There's syntax for this but let me work through an example to see how it works with a lock file.
This is a great write-up, thanks so much @DMRobertson!
I don't think it's sufficiently important that it should influence our decision, but currently synapse does a run-time check on all the dependencies, as a way to warn people if their virtualenvs are outdated (and to stop them reporting bugs caused by outdated deps). Is such a check possible under `poetry`?
There's no expectation that Poetry is present at runtime; it's primarily used during development and packaging. Running `poetry build` produces a normal Python wheel which includes requirements metadata in a standard format. You don't need Poetry to `pip install synapse`.
Thus, if we want run-time checks, we should re-implement those functions from `python_dependencies.py` in terms of importlib.metadata (backport) from the standard library. We should do this regardless (preferring standard approaches where possible), but we must do so if we move to Poetry, since it will replace our use of `setuptools`, and thus the information used by those functions won't be present where they expect it to be.
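A minimal sketch of what such a runtime check could look like (the helper names here are hypothetical, and real code should compare versions with the `packaging` library rather than this naive tuple comparison):

```python
import importlib.metadata  # stdlib in Python 3.8+; use the importlib_metadata backport otherwise

def _as_tuple(version: str) -> tuple:
    # Naive comparison key: ignores pre-release and local version segments.
    return tuple(int(part) for part in version.split(".") if part.isdigit())

def meets_minimum(installed: str, minimum: str) -> bool:
    """True if an installed version satisfies a '>=' minimum."""
    return _as_tuple(installed) >= _as_tuple(minimum)

def check_dependency(dist_name: str, minimum: str) -> bool:
    """True if dist_name is installed at or above the given minimum."""
    try:
        installed = importlib.metadata.version(dist_name)
    except importlib.metadata.PackageNotFoundError:
        return False
    return meets_minimum(installed, minimum)
```

The point is that `importlib.metadata` reads the installed distributions' metadata, so the check works regardless of whether the package was built by setuptools or Poetry.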
Dan beat me to it. But it's nice to finally understand why `python_dependencies` exists as an executable python file in the first place.

If we do want to maintain the runtime checks, we may also find packaging to be a useful library for parsing / comparing version specifiers (setuptools itself vendors `packaging` for this reason).
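For example, with packaging's specifier API (assuming the `packaging` distribution is installed; the version numbers are illustrative):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Parse the same kind of specifier we declare in our dependencies.
spec = SpecifierSet(">=3.4.7")

assert Version("36.0.0") in spec
assert Version("3.4.6") not in spec
```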
To address the recent batch of comments:
@DMRobertson It isn't 100% clear to me, but do any of these tools handle "extras"? It seems like for
pip-tools
it would be completely manual, but does poetry? (E.g. right now you canpip install synapse[oidc]
-- how does this work with pinning?)
"extras" are supported by poetry. (Not pipenv, as far as I can tell?) I believe that the pinning will pick a version and hashes for anything that could possibly be installed with any combination of extras, but I'll check this.
I've just checked that the lockfile contains dependencies for Python versions and other environment markers that you're not using. So I'm hopeful the same will hold true for extras.
(I think the top of the lockfile represents a dump of poetry's solver's output---across all extras, environments, etc and that gets used at install time to work out what to install. The rest of the lockfile is then the hashes to guard the supply chain.)
currently synapse does a run-time check on all the dependencies, as a way to warn people if their virtualenvs are outdated (and to stop them reporting bugs caused by outdated deps). Is such a check possible under poetry?

@richvdh poetry wouldn't be used for this at runtime, as Dan says. I haven't seen a way to ask poetry "does this environment meet the constraints, even if it doesn't match the lock file?" It's concerned with producing an environment that matches a lockfile exactly, so it only ever seems to compare an environment against a lockfile.
In other news, I noticed today that that poetry makes it pretty convenient to switch between a venv for different python versions. That could occasionally be useful for tricky backwards compatibility stuff.
I am slightly anxious at the number of open issues/prs on poetry's repository. I think it might be prudent to pin the version of poetry we use in CI. (The same applies to whichever tool we use, but I think poetry is the one that has the most activity going on.)
I believe that the pinning will pick a version and hashes for anything that could possibly be installed with any combination of extras, but I'll check this.

AFAICS this is true. I defined an extras group here. The lockfile included the new packages, but `poetry install` didn't do anything until I asked for `poetry install -E name_of_extra`.
Are there any outstanding questions, comments or concerns? If not, I'd like to try converting `signedjson` to use poetry and see how that goes.
Then try packaging a smaller library like signedjson.
Then try Sydent or Sygnal.
I had a go at this with signedjson, but I found myself a bit unsure about how to proceed given its status as a library. (We might want to pin deps in a library for a consistent dev environment, but probably not in CI?) I decided to drop it and try an application proper. That led to the WIP PR https://github.com/matrix-org/sydent/pull/488. It uses a few building blocks, the lowest-level of which is https://github.com/matrix-org/setup-python-poetry/pull/1. Please take a look when possible.
It'll require some work to get the runtime checks in `python_dependencies.py` working. Should hopefully be doable though: I can get my hands on versions using `importlib.metadata.version`, and there's a `requires` attribute on `Distribution` objects which appears to contain the info we need.
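As a rough illustration, the strings returned by `importlib.metadata.requires(...)` are PEP 508 requirement strings like `'cryptography>=3.4.7; python_version >= "3.7"'`, and could be split apart with something like the following (a hypothetical helper for illustration; real code should use `packaging.requirements` instead of a regex):

```python
import re

def parse_requirement(req: str):
    """Split a PEP 508 requirement string into (name, specifier, marker).
    Hypothetical sketch; not a full PEP 508 parser."""
    name_part, _, marker = req.partition(";")
    match = re.match(r"\s*([A-Za-z0-9_.\-]+)\s*(.*)", name_part)
    return match.group(1), match.group(2).strip(), marker.strip()
```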
It's not clear how one should best use `tox` together with `poetry`.

Personally, I never use `tox`. I run `mypy`, `black`, `isort` etc. locally (via the lint script) and let CI do the rest. I am tempted to remove `tox`, continue using the linter script locally, and have CI invoke `mypy` and friends directly within a poetry environment. (Which worked fine for sydent.)
Do we agree with the following? I propose that we want to be able to run
tests: across a representative range of debian/ubuntu distros and architectures
I don't think we do this currently? I don't know if it is particularly worth the cost of running all these extra CI runs.
tests: across a representative range of debian/ubuntu distros and architectures
I don't think we do this currently? I don't know if it is particularly worth the cost of running all these extra CI runs.
We do indirectly, via sytest. E.g. this PR ran sytest against Debian buster and testing, plus Ubuntu focal. I think that's a representative range; I'm not proposing we change this.
I don't think building the debs themselves runs the tests.
I don't think building the debs themselves runs the tests.
It runs the unit tests.
I don't think building the debs themselves runs the tests.
It runs the unit tests.
Can you point me at where this happens? I couldn't see it in `build_debian.sh`. Is this part of `dpkg-buildpackage`'s work?
yup. let me know if you need more info on how the debian build fits together.
I am tempted to remove tox
I'm not averse to this, but as I wrote in the chat:
having a single command which will build a clean virtualenv with an arbitrary version of python, and run the tests, is useful.
Is there an equivalent in the poetry world?
(most of the other stuff that we do via tox - such as running the lint tools - is mostly for convenience: once we have a tox-shaped hammer, a bunch of other things look like nails.)
Is there an equivalent in the poetry world?
Not at present. There are building blocks (`poetry env use 3.8` to switch to/create the venv; `poetry install --remove-untracked` for a clean install; one could define a script which could be invoked with `poetry run`).
Looks like things will be cleaner if we stick with `tox` and follow the advice laid out here. It'll mean that instead of a single `dev` environment that gets installed automatically, a contributor will have to install with `poetry install -E all,mypy,test,lint` or similar. I'm a little sad to lose that, but I guess that isn't any worse than the situation today.
Looks like things will be cleaner if we stick with `tox` and follow the advice laid out here.
I've played around with this for a day or so and found it hard to make progress. In summary:
When tox invokes pip to install the local directory, this fails with
ERROR: File "setup.py" or "setup.cfg" not found. Directory cannot be installed in editable mode: /home/runner/work/synapse/synapse (A "pyproject.toml" file was found, but editable mode currently requires a setuptools-based build.)
I think https://github.com/python-poetry/poetry-core/pull/182 would help here, but there's no stable release with that change yet.
I agree that it would be sad to lose this:
having a single command which will build a clean virtualenv with an arbitrary version of python, and run the tests, is useful.
But I think I'd rather write a short script to do this rather than spend longer getting tox and poetry to cooperate.
Some more context on `tox`/`poetry`'s interaction:
My initial reaction was: "can tox just install things itself, without getting Poetry involved?" Setting aside the fact this wouldn't necessarily be useful (since you'd be testing something other than what gets used), Poetry requires that you list the project's dependencies within poetry-specific `[tool.poetry.dependencies]` sections in `pyproject.toml`, which means that the only viable tool for installing those dependencies is poetry itself. https://github.com/python-poetry/poetry/issues/3332 might address that in the future, but it doesn't seem to be going anywhere quickly.
So if you want to use tox, you have to get tox to do things the Poetry way.
Poetry requires that you list the project's dependencies within poetry-specific `[tool.poetry.dependencies]` sections in `pyproject.toml`, which means that the only viable tool for installing those dependencies is poetry itself.
It's possible that tox could take advantage of the PEP517 information in pyproject.toml
to recognise that poetry is being used for the project. But honestly I don't really follow this web of PEPs.
Olivier mentions `usedevelop = false` might make `tox` more cooperative. I'll try it.
`usedevelop = false` in combination with https://pypi.org/project/tox-poetry-dev-dependencies/ helps me make some progress. But my latest error doesn't inspire confidence.
TL;DR: tox reads `twisted-iocpsupport` from the lockfile. This is a dependency of twisted that is only marked as required on Windows. tox tries to build it from source (I'm not sure why; I think wheels are available) and this fails. I would guess that the plugin doesn't consider environment markers in the lockfile.
I want to jot down some steps that need doing:

- Rework `python_dependencies.py` to read the version requirements using `importlib.metadata`.

Most of this involves CI config faffing rather than application changes.
This partially made its way into Synapse 1.57 (which had a parallel pyproject.toml/poetry.lock and setup.py). Synapse 1.58 has removed the setup.py completely. I'm going to close this issue as done and file new ones for any final fixups.
See also the project https://github.com/orgs/matrix-org/projects/54.
Just tried to build v1.61.0 on Linux and Windows and both fail.
I know this worked a few weeks ago and I basically think that OP was right with the version pinning.
Motivation: I want to bisect and thus build v1.61.0 .. v1.62.0 for identifying a patch which destroyed the generate call for us (I'll open a new ticket for that one).
Here is the error:
git checkout tags/v1.61.0
DOCKER_BUILDKIT=1 docker build -f docker/Dockerfile -t fred .
[+] Building 76.9s (14/25)
=> [internal] load build definition from Dockerfile 0.1s
=> => transferring dockerfile: 5.28kB 0.0s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 170B 0.0s
=> resolve image config for docker.io/docker/dockerfile:1 1.3s
=> [auth] docker/dockerfile:pull token for registry-1.docker.io 0.0s
=> docker-image://docker.io/docker/dockerfile:1@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc 1.0s
=> => resolve docker.io/docker/dockerfile:1@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc 0.0s
=> => sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced81e24ef1a0dbc 2.00kB / 2.00kB 0.0s
=> => sha256:ad87fb03593d1b71f9a1cfc1406c4aafcb253b1dabebf569768d6e6166836f34 528B / 528B 0.0s
=> => sha256:1e8a16826fd1c80a63fa6817a9c7284c94e40cded14a9b0d0d3722356efa47bd 2.37kB / 2.37kB 0.0s
=> => sha256:1328b32c40fca9bcf9d70d8eccb72eb873d1124d72dadce04db8badbe7b08546 9.94MB / 9.94MB 0.7s
=> => extracting sha256:1328b32c40fca9bcf9d70d8eccb72eb873d1124d72dadce04db8badbe7b08546 0.2s
=> [internal] load build definition from Dockerfile 0.0s
=> [internal] load .dockerignore 0.0s
=> [internal] load metadata for docker.io/library/python:3.9-slim 0.0s
=> [internal] load build context 0.3s
=> => transferring context: 7.41MB 0.2s
=> [stage-2 1/5] FROM docker.io/library/python:3.9-slim 0.2s
=> [builder 2/7] RUN --mount=type=cache,target=/var/cache/apt,sharing=locked --mount=type=cache,target=/var/lib/apt,sharing=locked apt-get update && apt-get install -y build-essential libffi-dev libjpeg-dev 51.4s
=> [stage-2 2/5] RUN --mount=type=cache,target=/var/cache/apt,sharing=locked --mount=type=cache,target=/var/lib/apt,sharing=locked apt-get update && apt-get install -y curl gosu libjpeg62-turbo libpq5 70.1s
=> [requirements 2/6] RUN --mount=type=cache,target=/var/cache/apt,sharing=locked --mount=type=cache,target=/var/lib/apt,sharing=locked apt-get update && apt-get install -y git && rm -rf /var/lib/apt/lists/* 60.6s
=> ERROR [requirements 3/6] RUN --mount=type=cache,target=/root/.cache/pip pip install --user "poetry-core==1.1.0a7" "git+https://github.com/python-poetry/poetry.git@fb13b3a676f476177f7937ffa480ee5cff9a90a5" 13.0s
------
> [requirements 3/6] RUN --mount=type=cache,target=/root/.cache/pip pip install --user "poetry-core==1.1.0a7" "git+https://github.com/python-poetry/poetry.git@fb13b3a676f476177f7937ffa480ee5cff9a90a5":
#13 3.856 Collecting git+https://github.com/python-poetry/poetry.git@fb13b3a676f476177f7937ffa480ee5cff9a90a5
#13 3.858 Cloning https://github.com/python-poetry/poetry.git (to revision fb13b3a676f476177f7937ffa480ee5cff9a90a5) to /tmp/pip-req-build-tufn8q4p
#13 3.866 Running command git clone --filter=blob:none --quiet https://github.com/python-poetry/poetry.git /tmp/pip-req-build-tufn8q4p
#13 7.242 Running command git rev-parse -q --verify 'sha^fb13b3a676f476177f7937ffa480ee5cff9a90a5'
#13 7.250 Running command git fetch -q https://github.com/python-poetry/poetry.git fb13b3a676f476177f7937ffa480ee5cff9a90a5
#13 7.839 Running command git checkout -q fb13b3a676f476177f7937ffa480ee5cff9a90a5
#13 9.253 Resolved https://github.com/python-poetry/poetry.git to commit fb13b3a676f476177f7937ffa480ee5cff9a90a5
#13 9.720 Installing build dependencies: started
#13 12.36 Installing build dependencies: finished with status 'done'
#13 12.37 Getting requirements to build wheel: started
#13 12.46 Getting requirements to build wheel: finished with status 'done'
#13 12.46 Preparing metadata (pyproject.toml): started
#13 12.77 Preparing metadata (pyproject.toml): finished with status 'error'
#13 12.77 error: subprocess-exited-with-error
#13 12.77
#13 12.77 × Preparing metadata (pyproject.toml) did not run successfully.
#13 12.77 │ exit code: 1
#13 12.77 ╰─> [26 lines of output]
#13 12.77 Traceback (most recent call last):
#13 12.77 File "/usr/local/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
#13 12.77 main()
#13 12.77 File "/usr/local/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
#13 12.77 json_out['return_val'] = hook(**hook_input['kwargs'])
#13 12.77 File "/usr/local/lib/python3.9/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 164, in prepare_metadata_for_build_wheel
#13 12.77 return hook(metadata_directory, config_settings)
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/masonry/api.py", line 40, in prepare_metadata_for_build_wheel
#13 12.77 poetry = Factory().create_poetry(Path(".").resolve(), with_groups=False)
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/factory.py", line 62, in create_poetry
#13 12.77 package = self.configure_package(
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/factory.py", line 156, in configure_package
#13 12.77 cls._add_package_group_dependencies(
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/factory.py", line 102, in _add_package_group_dependencies
#13 12.77 cls.create_dependency(
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/factory.py", line 366, in create_dependency
#13 12.77 dependency = Dependency(name, constraint, groups=groups)
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/packages/dependency.py", line 66, in __init__
#13 12.77 self.constraint = constraint # type: ignore[assignment]
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/packages/dependency.py", line 110, in constraint
#13 12.77 self._constraint = parse_constraint(constraint)
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/semver/helpers.py", line 31, in parse_constraint
#13 12.77 constraint_objects.append(parse_single_constraint(constraint))
#13 12.77 File "/tmp/pip-build-env-yp6fwb9j/overlay/lib/python3.9/site-packages/poetry/core/semver/helpers.py", line 147, in parse_single_constraint
#13 12.77 raise ParseConstraintError(f"Could not parse version constraint: {constraint}")
#13 12.77 poetry.core.semver.exceptions.ParseConstraintError: Could not parse version constraint: (>=20.4.3
#13 12.77 [end of output]
#13 12.77
#13 12.77 note: This error originates from a subprocess, and is likely not a problem with pip.
#13 12.78 error: metadata-generation-failed
#13 12.78
#13 12.78 × Encountered error while generating package metadata.
#13 12.78 ╰─> See above for output.
#13 12.78
#13 12.78 note: This is an issue with the package mentioned above, not pip.
#13 12.78 hint: See above for details.
#13 12.79 WARNING: You are using pip version 22.0.4; however, version 22.2.2 is available.
#13 12.79 You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
------
executor failed running [/bin/sh -c pip install --user "poetry-core==1.1.0a7" "git+https://github.com/python-poetry/poetry.git@fb13b3a676f476177f7937ffa480ee5cff9a90a5"]: exit code: 1
qknight's issue was pulled out to #13849.
For completeness: there was some discussion about how to get tox and poetry to interact, starting roughly at https://github.com/matrix-org/synapse/issues/11537#issuecomment-1047735769
I note that Poetry's docs (now?) have a section on tox configurations. TL;DR: if you want tox to install locked versions, you have to get it to invoke poetry.
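Concretely, the pattern Poetry's docs describe is to stop tox from installing the package itself and delegate to poetry in `commands_pre`. A minimal sketch (exact flags depend on the Poetry version in use, so treat this as illustrative rather than a known-good config):

```ini
# tox.ini (sketch): let poetry, not tox, install the locked dependencies
[testenv]
skip_install = true
allowlist_externals = poetry
commands_pre = poetry install --no-root --sync
commands = poetry run pytest {posargs}
```

The key point is that tox's own `pip install` step is bypassed entirely; otherwise pip would resolve against the version ranges in the package metadata instead of the lockfile.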
Hi folks,
Right now most of Synapse's dependencies only declare a minimum version bound, like
Twisted>=18.9.0
(cite). This means that every time we build release artifacts or install Synapse from source, we unconditionally pull in the latest versions of our dependencies at that moment. This creates unnecessary risk, as our dependencies can change out from under us without warning. For example, installing Synapse 1.49.0 may work today but fail tomorrow if one of our dependencies releases a new version overnight.
This exact scenario bit us with the release of attrs 21.1.0 (synapse#9936) on May 6th. We were forced to release Synapse 1.33.1 less than a day after 1.33.0 as the attrs release broke our ability to install Synapse from source, build working Debian packages, or create functioning Docker images, even though nothing in our repositories had changed.
The absence of locked dependencies also impedes our ability to pursue continuous delivery, maintain LTS releases, or easily backport security fixes, as we cannot recreate older release artifacts without also implicitly updating all of the dependencies included therein.
Definition of Done
Further Discussion / Constraints
Resolving this implies that it must be possible to enumerate exact versions of all dependencies included in any given upstream release of Synapse, using only a clone of the Synapse repository. This is important for auditing, as it allows us to easily answer questions like "did we ever ship a release with a vulnerable version of that dependency?"
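With exact versions committed to the repository, that audit question reduces to a text search against the lockfile at any past tag. A rough sketch, assuming a Poetry-style lockfile (the filenames and the commented `git show` line are illustrative; the `printf` just fabricates a two-line sample lockfile so the search step is runnable):

```shell
# In a real audit, the lockfile would come from the release tag, e.g.:
#   git show v1.49.0:poetry.lock > old.lock
# Here we fabricate a minimal stand-in lockfile instead:
printf '[[package]]\nname = "attrs"\nversion = "20.3.0"\n' > old.lock

# Which version of attrs did this release pin?
grep -A1 '^name = "attrs"' old.lock
```

Since lockfiles are plain TOML/text, no tooling beyond `git` is needed to answer "what did release X ship?".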
Moreover, any solution must record hash digests to protect against remote tampering, such as with pip's hash-checking mode.
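In pip's hash-checking mode, every requirement must be pinned to an exact version and carry at least one digest; pip aborts if a downloaded file's hash differs. A lockfile-style requirements file looks roughly like this (the version and the digest placeholder are illustrative, not real values):

```
# requirements.txt fragment, consumed via:
#   pip install --require-hashes -r requirements.txt
attrs==20.3.0 \
    --hash=sha256:<digest-of-the-published-wheel>
```

pip-tools can produce such files automatically with `pip-compile --generate-hashes`, which is one reason the `requirements.txt` ecosystem listed below is attractive here.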
To ease maintenance burden (and avail of GitHub's supply chain security features), it would be nice if whatever solution we arrived at integrated with Dependabot. Supported package ecosystems for Python are requirements.txt (pip / pip-tools), Pipfile.lock (Pipenv), and poetry.lock (Poetry).