Closed nejucomo closed 7 years ago
Thanks for this ticket!
Immediate thoughts are that I think package signing at this stage in the game is premature, as there are other avenues of very serious attacks available... However, the proposed system is not really related strictly to signing. It could also be used to implement something like https://pypi.python.org/pypi/peep. So personally I'm going to think about this ticket a little beforehand to figure out if I believe it's going to provide a useful feature without serious shortcomings in the near term.
This has another useful purpose too, companies or organizations could use it to disallow installing items that haven't been through a security audit or license review or what have you. For instance OpenStack could potentially use it to help ensure that an unapproved dependency isn't added.
A syntax like the following would be convenient:
`pip install --verify-<sig> -e git+https://github.com/pypa/pip#egg=pip`
...
These may be helpful for creating documentation on this feature and how it relates to other components of a secure python packaging process:
Source Repository GPG
Python Package GPG (`./<package>.asc`)
http://pythonhosted.org/distlib/tutorial.html#verifying-signatures
For any archive downloaded from an index, you can retrieve any signature by just appending `.asc` to the path portion of the download URL for the archive, and downloading that.
Python Wheel JWS S/MIME (PEP 427)
Index Mirror DSA (PEP 381)
Package Signatures for .deb, .rpm, ...
`gpgcheck`, `localpkg_gpgcheck`, `repo_gpgcheck`
Python Package Configuration Management Systems
[Cryptographic] Hash Functions
seeAlso: #425 (this comment)
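The ".asc appended to the path portion" rule quoted above can be sketched in a few lines (the helper name `asc_url_for` is made up for illustration; note that any `#md5=...` fragment must stay outside the path):

```python
from urllib.parse import urlsplit, urlunsplit

def asc_url_for(download_url):
    """Append ".asc" to the path portion only, preserving any #md5=... fragment."""
    parts = urlsplit(download_url)
    return urlunsplit(parts._replace(path=parts.path + '.asc'))

print(asc_url_for("https://pypi.python.org/packages/source/p/pip/pip-1.3.1.tar.gz#md5=abc"))
# https://pypi.python.org/packages/source/p/pip/pip-1.3.1.tar.gz.asc#md5=abc
```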
> This has another useful purpose too, companies or organizations could use it to disallow installing items that haven't been through a security audit or license review or what have you. For instance OpenStack could potentially use it to help ensure that an unapproved dependency isn't added.
So would a use case be something like verifying a dependency graph of packages' checksums and metadata?
I'm pretty interested in implementing this. Should I knock up a strawman Pull Request?
So I've thought about this some more, and it's really started to grow on me.
Some thoughts on what I'd like to see:
I think the hook should be a python hook that allows us to pass data about the thing we are trying to install into the hook easily, and receive more complex return types than pass/fail. If someone just wants to call a command, the subprocess module is simple to use, so the python portion of the hook in that case would be a small shim.
I think there need to be more return types than Pass/Fail. In my mind there are four distinct return values: Pass, Warn, Retry, Fail. The definitions of them (again, in my mind) would be:

- Pass: The installation looks fine, go ahead and install it.
- Warn: The installation is ok, but there is a warning that should be presented to the user (this one is possibly not needed and warning could be done via the logging system).
- Retry: This particular package is unsuitable, but pip can attempt to locate another package that fulfils this dependency (either from a different location, a different type, or a different version).
- Fail: This package is unsuitable. Pip should not attempt to satisfy it and should throw an error.
At least that's what I think :) I'd love a PR that implements this hook feature.
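A rough sketch of those four results as a Python hook contract (the names here are illustrative, not a proposed pip API):

```python
from enum import Enum

class HookResult(Enum):
    PASS = 'pass'    # install the package
    WARN = 'warn'    # install, but surface a warning to the user
    RETRY = 'retry'  # reject this candidate; try another location/type/version
    FAIL = 'fail'    # reject and abort dependency resolution

def run_hooks(hooks, candidate):
    """Apply each hook in turn; the most severe result wins."""
    severity = [HookResult.PASS, HookResult.WARN, HookResult.RETRY, HookResult.FAIL]
    worst = HookResult.PASS
    for hook in hooks:
        result = hook(candidate)
        if severity.index(result) > severity.index(worst):
            worst = result
    return worst
```

For example, `run_hooks([lambda c: HookResult.WARN, lambda c: HookResult.PASS], pkg)` would return `HookResult.WARN`.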
> I think there need to be more return types than Pass/Fail. In my mind there are four distinct return values: Pass, Warn, Retry, Fail. The definitions of them (again, in my mind) would be:
>
> - Pass: The installation looks fine, go ahead and install it.
> - Warn: The installation is ok, but there is a warning that should be presented to the user (this one is possibly not needed and warning could be done via the logging system).
> - Retry: This particular package is unsuitable, but pip can attempt to locate another package that fulfils this dependency (either from a different location, a different type, or a different version).
> - Fail: This package is unsuitable. Pip should not attempt to satisfy it and should throw an error.
What are the `sys.exit()` codes for each of these?
> So would a use case be something like verifying a dependency graph of packages' checksums and metadata?
Pip package lists are specified as requirement specifiers in requirements.txt files.
So, in order to verify a list (a topologically sorted dependency graph) of python packages required for an environment, it would be necessary to determine the path to the `.asc` file for each package listed in a requirements description format.
I don't think it should be shelling out to an executable by default. It should call a python function as a hook and use a python return value. If people want their particular instance of the hook to shell out, that's a simple python wrapper they can write on their own.
A `verify()` hook takes two inputs:
- a path to a local file, which is a newly downloaded package
- a URL which the file was retrieved from
```python
source_url = "https://pypi.python.org/packages/source/p/pip/pip-1.3.1.tar.gz#md5=cbb27a191cebc58997c4da8513863153"
asc_url = "https://pypi.python.org/packages/source/p/pip/pip-1.3.1.tar.gz.asc#md5=cbb27a191cebc58997c4da8513863153"
pkg_file = "./path/to/pip-1.3.1.tar.gz"
asc_file = "./path/to/pip-1.3.1.tar.gz.asc"

def verify(pkg_file, asc_file, source_url, asc_url):
    return [distlib].index.verify_signature(asc_file, pkg_file)
```
... http://pythonhosted.org/distlib/tutorial.html#verifying-signatures
Distlib's Signature support is inherently broken. You cannot just pipe out to GPG and trust whatever keys are in the trustdb. Just because you trust me for X does not mean you trust me for Y.
[Distlib Signature Support]
So remove `[distlib].index`? I guess the question I was trying to ask was: what is the minimal python function call signature necessary to most correctly verify what it is we are trying to verify here.
I prefer a hook api where the config specifies a path to an executable in pip's config file.
http://stevedore.readthedocs.org/en/latest/ may be useful for adding hooks / plugins / extension points and/or as a reference for [setuptools entry_point configuration]
Mercurial hooks and extensions pass something like a context `dict` instead of positional arguments as with the `verify()` interface listed above.
The inputs are passed as commandline arguments to a subprocess which invokes that command.
How and when should I sanitize this input? What is the best way to specify the command arguments?
```python
# Pass arguments as a sequence and keep shell=False:
cmd = ("bash", string_downloaded_from_the_internets)   # with shell=False
# NOT:
cmd = "bash %s" % string_downloaded_from_the_internets  # with shell=True
```
The hook's stdout & stderr are the same as the parent pip process. The exit status is 0 to indicate "accept package" and non-zero to indicate "reject package".
From a shell script, is there then a way to differentiate between failed and sig-check-failed for an `install -r requirements.txt`?
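One way a subprocess hook could distinguish those cases is by reserving distinct exit codes. The specific numbers below are assumptions for illustration, not anything pip defines:

```python
import subprocess
import sys

# Hypothetical exit-code convention for an external verify hook:
HOOK_PASS = 0              # accept the package
HOOK_FAIL = 1              # generic failure
HOOK_SIG_CHECK_FAILED = 2  # a signature was present but did not verify

def run_verify_hook(argv):
    """Invoke the hook command and map its exit status to an outcome."""
    status = subprocess.call(argv)
    if status == HOOK_PASS:
        return 'accepted'
    if status == HOOK_SIG_CHECK_FAILED:
        return 'signature check failed'
    return 'rejected'

# Simulate a hook that exits with status 2:
print(run_verify_hook([sys.executable, '-c', 'import sys; sys.exit(2)']))
# signature check failed
```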
I would be in favor of either and/or both:

- `verify()`: e.g. `verify(pkg_file, asc_file, source_url, asc_url)`
- a `dict` keyset for the argspec parameters: e.g. `dict.fromkeys(('pkg_file', 'asc_file', 'source_url', 'asc_url'), None)`
... These were the stevedore documentation links I was looking for:
- Enabled through Installation
- Enabled explicitly
- Self-Enabled
Is the package signature hook called for .zip, .egg, and .whl packages AND for editable distributions?
There are new metadata attributes for package source locations.
```python
# PEP 345 Metadata 1.2
download_url = str

# PEP 426 Metadata 2.0
source_url = {
    'key': http_path,
    'key_2': 'git+https://editable/path@version'
}
```
> Distlib's Signature support is inherently broken. You cannot just pipe out to GPG and trust whatever keys are in the trustdb.
How so? You can specify the keystore to use. If necessary to support a potentially different keystore for each file, this could be accommodated via an extra argument to the `verify_signature` method. This is an incremental change to the API to make it more convenient, but can you explain why you think it's inherently broken?
Because throwing cryptography at a problem without providing a solution to the actual problem doesn't do anything. Your solution uses GPG; GPG has a built-in trust model which doesn't work for PyPI-style packaging where it's a free-for-all. The GPG web of trust validates identity, but it doesn't validate that a person is allowed to sign for a particular file. You say that you can just point to a different trustdb in that case, but that still doesn't solve the underlying problem of how something gets into the trustdb to begin with.
Implementing packaging signing needs to start with a proper trust model, just slapping some crypto on top of it doesn't solve the problem.
I see what you mean, but how something gets into the trust database is not really up to `distlib` to solve. To do things properly you need something like a web of trust - the `distlib` approach can still work in specific environments and scenarios for some people / organisations. No one piece of software can solve the trust problem, and it's not up to low-level software like `distlib` to determine which keys are trusted (that would be policy, not mechanism). Providing a piece of the puzzle is not "throwing cryptography at a problem" - it's more like "if you have keys you trust, then `distlib` provides a straightforward way of verifying signatures".
Keys you trust for what?
> Keys you trust for what?
A key you trust to verify the signature of a specific package you downloaded. This will be the package publisher's public key (the corresponding private key having been used by the publisher to sign the package you downloaded), which you will have obtained through some trusted channel (so that you know the key belongs to the publisher, rather than someone claiming to be the publisher). This is easier said than done, but certainly doable for specific packages and publishers, with their cooperation.
Ok, so what's the mechanism of specifying that a certain key is only trusted for a certain package?
> Ok, so what's the mechanism of specifying that a certain key is only trusted for a certain package?
- Get the key you trust for a package into a GPG keystore in directory `/path/to/keys`.
- If `index` is an instance of `distlib.index.PackageIndex`, do `index.gpg_home = '/path/to/keys'`.
- Ensure that you have downloaded the archive and signature for the package to e.g. `/path/to/package.tar.gz` and `/path/to/package.tar.gz.asc`.
- Call `index.verify_signature('/path/to/package.tar.gz.asc', '/path/to/package.tar.gz')`.
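Wrapped up as a function, those steps read roughly as below. The function name is made up, and `index` is expected to be a `distlib.index.PackageIndex` instance (which pipes out to gpg under the hood):

```python
def verify_download(index, asc_path, pkg_path, gpg_home):
    """Point the index at a keystore holding only keys we explicitly trust,
    then verify the detached signature against the downloaded archive."""
    index.gpg_home = gpg_home          # e.g. '/path/to/keys'
    return index.verify_signature(asc_path, pkg_path)
```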
> Implementing packaging signing needs to start with a proper trust model, just slapping some crypto on top of it doesn't solve the problem.

> Ok, so what's the mechanism of specifying that a certain key is only trusted for a certain package?

> - Get the key you trust for a package into a GPG keystore in directory /path/to/keys.
So the trust model must include a mechanism for specifying which keys are valid for which packages?
> - Get the key you trust for a package into a GPG keystore in directory /path/to/keys.
See, this is the entirety of the hard part of the problem domain, but you've neatly tucked it away in a single sentence. Actual signing and verifying has been easy for the past decade. It's so mechanically easy it's hardly worth implementing (and possibly even dangerous to do so, as you may give users a false sense of security) until you have a rigorous design for problem number 1: how do I get keys for people I trust, and how do I decide what the heck I trust them with, and when, and for what.
I'd liken implementing package signing and verification without a well-thought identity, ownership and trust model overlying it, to implementing SSL in a browser without a PKI or certificate verification.
2. Key <-> Package mappings
Is this a signed graph with typed edges?
With SSL, certs are tied to DNS (technically "Common Name") identifiers.
Not all packages are on PyPi, so a PyPi URN wouldn't solve for as many cases as just mapping Keys to Package URIs with 'types' (or 'roles'?): {"committer", [...], "-er" }.
To me, this seems like a useful metadata requirement to impose upon software project teams.
Could such "Key <-> Package mappings" metadata be inlined (topologically) with checksums in `requirements.txt` and/or `requirements.txt.lock` files (like peep)? Are there cert store formats which can store (package_uri, role, key) tuples?
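A naive in-memory form of such a tuple store might look like the following. This is purely illustrative — no such store exists in pip, and the fingerprint is made up:

```python
# Trust entries as (package_uri, role, key_fingerprint) tuples.
TRUST_STORE = {
    ("https://pypi.python.org/pypi/pip", "committer",
     "0123456789ABCDEF0123456789ABCDEF01234567"),  # example fingerprint, not real
}

def key_trusted_for(package_uri, role, fingerprint):
    """True only if this exact (package, role, key) combination has been vetted."""
    return (package_uri, role, fingerprint) in TRUST_STORE
```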
[EDIT]
> Get the key you trust for a package into a GPG keystore in directory /path/to/keys.

> See, this is the entirety of the hard part of the problem domain, but you've neatly tucked it away in a single sentence.
Why write a screed when a short sentence will do? This has been discussed elsewhere many times.
> Actual signing and verifying has been easy for the past decade. It's so mechanically easy it's hardly worth implementing
Well, I've implemented it for my own use, and others can use that implementation or not, just as they choose :-)
> (and possibly even dangerous to do so, as you may give users a false sense of security) until you have a rigorous design for problem number 1
You're saying you shouldn't provide a solution for some people unless you provide a solution for everyone? I don't agree with this argument - it's a bit like saying PKI shouldn't have been invented at all, or that C shouldn't have been invented until the problem of buffer overflow exploits was solved ;-) There are scenarios where one can obtain and use trusted keys, and I have used PKI and GnuPG successfully in such scenarios. And a "false sense of security" can even bite seasoned security pros - just look at all the exploits around SSL - but that doesn't mean we should have nothing in its place.
It's worth noting that the complexity of the trust problem for package distribution is the main reason http://www.python.org/dev/peps/pep-0458/ and "The Update Framework" itself exist.
In relation to the idea of "implement a hook that assumes an already verified GPG trust DB", well, that's the same reason I signed off on Daniel's embedded signature support in PEP 427 - he had a constrained environment where he wanted to use that feature, and it was easy enough for everyone else to just ignore. The same goes for folks that have sorted out their GPG trust issues.
As far as this issue goes, +1 from me for the notion of making the verification step pluggable - we just need to be careful how those plugins get configured, because indirect attack vectors are always fun for all involved :)
As there is no native support available I am using a workaround based on Verifying PyPI and Conda Packages for my packages. Examples: yaml4rst, hlc
From https://github.com/blockchain-certificates/cert-schema/issues/25#issuecomment-282571524 :
A specified set of cryptographic primitives typically consisting of a canonicalization algorithm, a message digest algorithm, and a signature algorithm that are bundled together by cryptographers for developers for the purposes of safety and convenience.
At a minimum, a signature suite must have the following attributes:
A signature set is useful when the same data needs to be signed by multiple entities, but where the order of signatures does not matter, such as in the case of a set of signatures on a contract. A signature set, which has no order, is represented by associating a set of signatures with the signature key in a document.
A signature chain is useful when the same data needs to be signed by multiple entities and the order of when the signatures occurred matters, such as in the case of a notary counter-signing a signature that had been created on a document. A signature chain, where order must be preserved, is represented by associating an ordered list of signatures with the signatureChain key in a document.
EXAMPLE 4: A signature chain in a Linked Data document

```json
{
  "@context": "https://w3id.org/identity/v1",
  "title": "Hello World!",
  "signatureChain": [{
    "type": "RsaSignature2015",
    "creator": "http://example.com/i/pat/keys/5",
    "created": "2011-09-23T20:21:34Z",
    "domain": "example.org",
    "nonce": "2bbgh3dgjg2302d-d2b3gi423d42",
    "signatureValue": "OGQzNGVkMzVm4NTIyZTkZDY...NmExMgoYzI43Q3ODIyOWM32NjI="
  }, {
    "type": "RsaSignature2015",
    "creator": "http://bank.example.com/notary/keys/7f3j",
    "created": "2011-09-23T20:24:12Z",
    "domain": "example.org",
    "nonce": "83jj4hd62j49gk38",
    "signatureValue": "yZTkZDYOGzNGVkMVm4NTIQz...M32NjINmExMDIyOWgoYzI43Q3O="
  }]
}
```
EXAMPLE 5: A complete example of a signature suite

```json
{
  "id": "https://w3id.org/security#LinkedDataSignature2015",
  "type": "SignatureSuite",
  "canonicalizationAlgorithm": "https://w3id.org/security#URDNA2015",
  "digestAlgorithm": "http://example.com/digests#sha512",
  "signatureAlgorithm": "http://www.w3.org/2000/09/xmldsig#rsa-sha256"
}
```
I'm going to close this; I don't think we're going to implement it (nor do I think we want to), and TUF will provide a better mechanism for signed packages once that is implemented.
@dstufft We really need a feature like that nowadays. As you might have noticed, multiple websites get compromised - HandBrake is one sample. Users need to be able to verify the source via GPG to ensure no modifications in transit or on the server were made.
This is especially important as a lot of users use pip to download their python modules, simply because they are not available in the operating system or just because lots of google posts suggest this - especially because most of them suggest to install via `sudo` and not via `--user`. This is a very large attack vector without GPG source verification.
Please add an option for GPG verification and also suggest the user to verify the source if signatures are available (and display the fingerprint to the user).
It's almost certain there is not going to be an option to verify GPG signatures within pip. GPG signatures are practically worthless on their own unless you have a trust model (and the built in web of trust is not good enough) and any effort that goes into implementing a trust model around GPG that works for us would be better spent implementing TUF.
@dstufft you specify the trusted key in the install command as written above. And the website that requires to install those deps will also list the fingerprints of the signed sources. Then pip compares the provided fingerprints on the pypi server with the command line. This way a pypi server side hack will be noticed.
This is a general problem of crypto. But you can't excuse yourself with the statement that it's not 100% failsafe and GPG is therefore not usable. It's the best and only real solution we have to verify sources. And if you make it not too complicated for the use cases above, it's a fairly simple process.
@NicoHood We're thoroughly skeptical of claims that this is in high demand or a major end user security concern, as we have zero commercial pip redistributors reporting sufficient customer demand for them to invest engineering time in improving the security model of the tooling. Instead, they either cache the published hashes, or cache entire artifacts, such that PyPI compromises after the initial release won't have any impact on them and their customers.
Similarly, publishers can detect any such post-publication compromises for themselves by maintaining a list of previously published hashes, and checking them against what PyPI is providing (or what redistributors are providing, for that matter - assuming they're republishing unmodified sources without applying any downstream patches).
Signatures are only useful as a way of verifying publishers, and GPG has no trust model to enable that in a useful form for an open platform like PyPI (this isn't like a Linux distro where you'd just be trusting the GPG key used in the distro's build system).
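The "maintain a list of previously published hashes" approach amounts to something like the following peep-style check (a sketch, not existing pip functionality):

```python
import hashlib

def sha256_of(path):
    """Hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()

def check_against_recorded(path, recorded_hashes):
    """True only if the artifact matches a hash recorded at original publication."""
    return sha256_of(path) in recorded_hashes
```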
> We're thoroughly skeptical of claims that this is in high demand or a major end user security concern, as we have zero commercial pip redistributors reporting sufficient customer demand for them to invest engineering time in improving the security model of the tooling. Instead, they either cache the published hashes, or cache entire artifacts, such that PyPI compromises after the initial release won't have any impact on them and their customers.
Is there demand for end-to-end security in a continuous deployment workflow?
...
- `hg sigs` / `hg sigcheck` (Mercurial GPG extension)
- OS Packages
> Similarly, publishers can detect any such post-publication compromises for themselves by maintaining a list of previously published hashes, and checking them against what PyPI is providing (or what redistributors are providing, for that matter - assuming they're republishing unmodified sources without applying any downstream patches).
A `diff -r $rev -r $rev_after_packaging_packages_are_applied` would be real nice.

> Signatures are only useful as a way of verifying publishers, and GPG has no trust model to enable that in a useful form for an open platform like PyPI (this isn't like a Linux distro where you'd just be trusting the GPG key used in the distro's build system).
What could solve for this?
Need: ACLs (project, (user, key), permissionstr)
WebAccessControl is a decentralized system for allowing different users and groups various forms of access to resources where users and groups are identified by HTTP URIs.
Need: Identity (PKI || BlockChain)
Need: Signatures
```json
"signature": {
  "type": ["MerkleProof2017", "Extension"],
  "merkleRoot": "68f3ede17fdb67ffd4a5164b5687a71f9fbb68da803b803935720f2aa38f7728",
  "targetHash": "c9ead76a54426b4ce4899bb921e48f5b55ea7592e5cee4460c86ebf4698ac3a6",
  "proof": [{
    "right": "7fef060cb17614fdfddd8c558e102fbb96433f5281e96c80f805459773e51163"
  }],
  "anchors": [{
    "sourceId": "8623beadbc7877a9e20fb7f83eda6c1a1fc350171f0714ff6c6c4054018eb54d",
    "type": "BTCOpReturn"
  }]
}
```
Challenges:
@westurner You've been warned multiple times on multiple projects not to post random link dumps into tracker issues (and elsewhere). Please voluntarily refrain from doing so, so it doesn't need to escalate to another block.
Excuse yourself. I am offended.
You have not presented a solution.
Nor have you assisted other conversation participants with this type of security.
A signed ACL list in a DHT would certainly solve the need for cryptographic signatures here. (Where, again, GPG does not solve for authorization.)
@westurner We've had a defined technical solution to this problem for years, and Donald referred to it above: The Update Framework.
The details are covered in two PEPs:

- PEP 458 -- Surviving a Compromise of PyPI: https://www.python.org/dev/peps/pep-0458/
- PEP 480 -- Surviving a Compromise of PyPI: The Maximum Security Model: https://www.python.org/dev/peps/pep-0480/
This was also one of the key points of concern I raised in my overview of the state of Python packaging last year: http://www.curiousefficiency.org/posts/2016/09/python-packaging-ecosystem.html#making-pypi-security-independent-of-ssl-tls
It is not a technical problem now, and hasn't been since those PEPs were written. Throwing more technical ideas or evidence of unfunded demand at the PyPA developers does nothing to advance the situation.
Instead, it's a funding and sustainability problem, that requires folks either to lobby commercial redistributors to tackle this problem comprehensively on behalf of their customers, or else to make the case for why the PSF should fund this when vendors with a strong reputation for handling open source security management concerns on behalf of their customers decline to do so. Either way, the PyPA developers are not the right people to be directing any advocacy towards.
So, with TUF, IIUC:

- that's centralized PKI
- pypi is then the SPOF?
- "ACL list" ~= root.json
- I would suggest ld-signatures as a future-proof standard for JSON document signing.
- was this written by the same person who chose to write Warehouse in Pyramid? Thanks.
I believe this is the correct issue in which to discuss this (and other out-of-band ways of validating software packages) because the question is specifically requesting a way to verify (signed) hashes.
> pypi is then the SPOF
Can I use TUF with devpi instead of pypi/warehouse? (With centralized PKI)
Each repository is responsible for its own security, so if you're using PyPI, then packages installed from PyPI derive their trust from a PyPI-specific set of root keys. If you're using DevPI it will be up to DevPI to support TUF with its own instance-specific set of root keys. DevPI would/could validate the trust from PyPI before mirroring it onto DevPI and signing it itself.
This issue should be re-opened. I'm not asking for the system to be perfect. I will download the GPG public keys for the packages I want to be able to install via pip; I simply want pip to only allow installation of packages that match those signatures. If someone changes the key (or removes it), it's my problem to figure out if the key was legitimately changed or if someone compromised the package. It's really no different than what I do for deb repos, for example.
@rhuddleston If you're willing to trust the GPG key management practices of arbitrary publishers, then it's already entirely feasible to implement your own pip wrapper that adds the check you're seeking.
You don't need anyone's permission for that, and you certainly don't need to wait for hook support in the official pip client. (As a previous example of something like this, checking downloads against previously recorded hashes started out as a `peep` feature, rather than as a `pip` one.)
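Such a wrapper only needs to compose existing tools. A sketch that builds (but does not run) the command lines it would invoke - the keyring path, package name, and layout are placeholders, and the `.asc` file is assumed to have been fetched separately:

```python
def build_verify_commands(package, version, gpg_home='/path/to/keys', dest='.'):
    """Command lines a GPG-verifying pip wrapper could run, in order."""
    sdist = f"{dest}/{package}-{version}.tar.gz"
    return [
        # 1. Download the sdist only, without installing anything.
        ["pip", "download", "--no-deps", "--no-binary", ":all:",
         "--dest", dest, f"{package}=={version}"],
        # 2. Verify the detached signature against a keyring of vetted keys.
        ["gpg", "--homedir", gpg_home, "--verify", sdist + ".asc", sdist],
        # 3. Only if step 2 succeeded, install the verified archive offline.
        ["pip", "install", "--no-index", sdist],
    ]
```

A real wrapper would run each command with `subprocess.check_call` and abort on the first failure.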
But we're not going to recommend GPG as a general measure, because the web of trust model doesn't scale adequately for an open publishing platform with arbitrary publishers: it relies on the assumption that the signing keys are managed securely, and we simply don't agree that that's a well-founded assumption in the context of PyPI.
So instead of using (the not perfect) GPG you simply leave it as it is without any kind of verification?
No, we use the only verification we can currently meaningfully offer:

- hash checking to ensure that previously downloaded artifacts don't change
- completely out-of-band signature checking that bypasses PyPI and the PyPA tooling entirely (as if you genuinely don't trust the PyPI admins, you can't trust any package signatures that PyPI publishes, nor any signature checking tools obtained from PyPI)
Unlike Linux distros, where GPG signatures provide assurance that the software you're installing was actually published by the distro, GPG signatures provide no meaningful assurance in the context of an open publication platform like PyPI - believing they do is only possible in the absence of clearly defined threat modelling that identifies the actors and actions you're aiming to defend against, and the kinds of trust you're aiming to enable.
It is possible to create a trust management system that would meaningfully improve the state of PyPI security by reducing the reliance on the HTTPS CA system for delivery assurance (see the links to PEP 458 and PEP 480 above), but "just add GPG!" isn't it.
I disagree that GPG signature checking is any more useless than distro GPG signature checking.
Should signing keys be distributed over a channel different from the packages themselves (which arrive over HTTPS/TLS)? Yes.
There are other channels for GPG key distribution, but there also needs to be a way to specify which keys are valid for which package; that is true for both PyPI and for distros.
How would a pip option to fail installation when GPG keys are absent or invalid provide any more of a false sense of security than failing when hashes don't match previously recorded hashes (as peep does)?
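For concreteness, the "which keys are valid for which package" question could be answered with a locally managed trust store pinning package names to key fingerprints. The following is a hypothetical sketch, not an existing pip or PyPI mechanism; the file format, function names, and reliance on a working local gpg install are all assumptions:

```python
"""Hypothetical trust store mapping package names to GPG key
fingerprints. Nothing here is a pip feature; names are illustrative."""

import subprocess


def load_trust_store(text):
    """Parse lines of the form '<package> <fingerprint>' into a dict.
    Blank lines and '#' comments are ignored."""
    store = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()
        if not line:
            continue
        package, fingerprint = line.split()
        store[package.lower()] = fingerprint.upper()
    return store


def verify_package(store, package, archive_path):
    """Verify archive_path against its detached signature (.asc) and
    check that the signing key is the one pinned for this package."""
    fingerprint = store.get(package.lower())
    if fingerprint is None:
        return False  # no pinned key: reject rather than trust blindly
    result = subprocess.run(
        ["gpg", "--status-fd", "1", "--verify",
         archive_path + ".asc", archive_path],
        capture_output=True, text=True)
    # On a good signature, gpg emits a machine-readable status line:
    # "[GNUPG:] VALIDSIG <fingerprint> ..."
    return any(
        line.startswith("[GNUPG:] VALIDSIG " + fingerprint)
        for line in result.stdout.splitlines())
```

This sidesteps the web-of-trust entirely: trust is expressed as explicit per-package pins, which is exactly the part the thread notes nobody currently maintains for users.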
To be clear: if end users correctly managed a trust store that mapped project names to GPG keys, then that would be fine, and it would add an additional layer of security over what currently exists.
The issue is ultimately one of impact. Due to differences between the distro case and the PyPI/pip case, we do not currently have a mechanism in place to automatically map projects to GPG keys, which means that end users would be responsible for doing this themselves. It is my opinion that the vast bulk of people will simply not bother, so we would be adding this feature for little benefit beyond a minority of users.
Now, one could argue that adding a feature a user can ignore doesn't cost them anything, but in my opinion it does. It adds overhead to what they need to understand in order to actually use pip, and more things they need to weed through. On the maintenance side it also adds complexity, which makes pip harder to test, develop, and maintain in the long run, particularly for something we're pretty sure we're not going to keep.
The other problem here is an ecosystem one. By providing a way to validate GPG keys, we'd implicitly be telling people that they should be signing their packages with GPG. Since we're already fairly sure we're not going to use that approach, we'd effectively be making work for people that they'll want to throw away at some point.
So what of GPG-signed distro repacks?
The other problem here is an ecosystem one. By providing a way to validate GPG keys, we'd implicitly be telling people that they should be signing their packages with GPG. Since we're already fairly sure we're not going to use that approach, we'd effectively be making work for people that they'll want to throw away at some point.
I think that is a very important point. We do not need to create busywork for FOSS maintainers if it does not really improve things.
The reason GPG signing is effective in the Linux distro case is that its main purpose is to let the distro's own publishers ensure the integrity of the link from the distro's build system to end users' installations, even when that link traverses untrusted systems like public mirrors and the internet. The publishing system and the consumption system are controlled by the same entity, and you have to go through some form of review process to get access to the publishing end. The meaningful assurances of trustworthiness then come from the combination of GPG content signing, pre-publication review, and publisher key management, not from content signing alone. (The Linux distros also take care of ensuring that GPG key management infrastructure is available and working for both publishers and consumers, whereas "GPG will already be available and working" is an entirely invalid assumption on non-Linux systems.)
Aside from the UX train wreck that is attempting to set up GPG signature checking on non-Linux systems, the key architectural differences in the PyPI case are that there is no pre-publication review process and no standardised process for publisher key management. Adding only the GPG content-signing part, without addressing either of those aspects, thus becomes purely a matter of security theatre, adding minimal value beyond the link-integrity protection already offered by HTTPS.
The lack of end-to-end signing support (outside the embedded signature support in the wheel file format) does mean that both the PyPI admins and the Fastly CDN admins constitute an "insider threat" for all consumers of content from PyPI. Now, it is possible for us to design and develop a system to inherently neutralise that threat (and PEP 480 describes one such system), but it's also possible to neutralise it through less mathematically sophisticated methods, like folks publishing expected artifact hashes through an independent registry, and publishers explicitly checking that the artifacts that PyPI publishes are the ones they uploaded.
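The "less mathematically sophisticated" mitigation described above amounts to comparing the digest of what PyPI actually served against a hash published through an independent channel. A minimal sketch, where the registry is assumed to be a simple name-to-digest mapping obtained out of band (how it is fetched and maintained is left open):

```python
"""Sketch of checking a served artifact against an independently
published hash registry. The registry format is invented for
illustration."""

import hashlib


def sha256_digest(data: bytes) -> str:
    """Hex SHA-256 digest of a downloaded artifact's bytes."""
    return hashlib.sha256(data).hexdigest()


def check_against_registry(artifact: bytes, registry: dict,
                           filename: str) -> bool:
    """Compare the served artifact against a hash obtained out of band
    (e.g. from a publisher-maintained registry fetched over a separate
    channel). A mismatch means the index/CDN served something different
    from what the publisher uploaded."""
    expected = registry.get(filename)
    return expected is not None and sha256_digest(artifact) == expected
```

Publishers doing the complementary check (downloading their own release from PyPI and comparing it to what they uploaded) closes the loop from the other side.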
However, effectively designing such a system requires people to actually define and document the threat model they're attempting to defend against, and choose the appropriate tools and techniques to provide the greatest increase in integrity assurance at the lowest cost in time and effort for publishers, infrastructure maintainers, and end users, rather than simply assuming that because a particular technique (i.e. GPG content signing) works well in the context of a Linux distribution, that same technique will be able to provide meaningful assurances in the context of an open publication platform like PyPI.
Synopsis
Some people want package signature verification during their pip installs. Other people think relying on authenticated package repository connections (such as over TLS) is sufficient for their needs.
Of those who want package signature verification, there is disagreement about how to tell pip which signatures to trust (and how users will manage package-signing public keys).
Rationale
This ticket proposes a mechanism in mainline pip that lets signature-verification enthusiasts experiment with different approaches. If a particular approach becomes popular, pip could consider incorporating it.
In the meantime, rather than having endless committee-style arguments about how to do package verification, we should have a system that lets users choose for themselves, but only if they opt in.
Also, it keeps package verification cleanly separate from the pip codebase.
Criteria
This ticket may be marked as wontfix, or given some other status, to indicate that the pip developers reject this proposal.
Otherwise, this ticket may be marked closed only when these conditions are met:
Implementation Details
I prefer a hook API where pip's config file specifies a path to an executable. The inputs are passed as command-line arguments to a subprocess that invokes that command, the hook's stdout and stderr are shared with the parent pip process, and the exit status is 0 to indicate "accept package" or non-zero to indicate "reject package".
...but I'd be happy with any system that fulfills the Criteria above.
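The proposed protocol is small enough to sketch end to end. The function and hook below are hypothetical (this is not an existing pip feature); the pip side would simply spawn the configured executable and interpret its exit status:

```python
"""Sketch of the proposed verification-hook protocol: pip's config
names an executable; pip runs it with the downloaded package path as
an argument; exit status 0 means accept, non-zero means reject."""

import subprocess
import sys


def run_verification_hook(hook_cmd, package_path):
    """Invoke the configured hook with the package path appended.
    stdout/stderr are not captured, so they pass straight through to
    the user, as the proposal requires."""
    result = subprocess.run(list(hook_cmd) + [package_path])
    return result.returncode == 0  # 0 = accept, non-zero = reject


if __name__ == "__main__":
    # Example hook that accepts everything; a real hook might check a
    # detached signature, a hash database, or an audit whitelist.
    trivial_hook = [sys.executable, "-c", "import sys; sys.exit(0)"]
    accepted = run_verification_hook(trivial_hook, "pkg-1.0.tar.gz")
    print("accept" if accepted else "reject")
```

Because the contract is just "argv in, exit status out", the hook can be written in any language, which is what keeps the verification policy cleanly outside the pip codebase.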
Related Issues
Note: there is a less well-specified ticket, #425. I made this ticket because that one's vagueness makes it difficult to close. (Is #425 satisfied by TLS authentication to package repositories based on a standard OS or user trust root? Or does it imply or require package signature verification?)