+1 this needs to be a blocker.
I agree the suites / key format deliverables should define how to represent private key material. I think everyone is on board with this. IMO, this also shouldn't be an issue for multicodec (see: https://github.com/multiformats/multicodec/blob/master/table.csv#L148) nor a blocker to getting the work started -- the WG will handle this.
Are there changes that need to be made in the charter text in order to enable the WG to do this?
I'm a -1 to this
My arguments have been made in this thread: https://github.com/w3c/vc-wg-charter/pull/51#discussion_r813480540
I don't buy into the idea that the WG should be defining key formats in the VCWG and that should remain separately abstracted from the signature formats.
Emphasis mine. @kdenhartog is this your contention? I agree. If we could rephrase -- the VCWG should only use normatively defined crypto suites, not define them itself.
@OR13 it is not clear to me what changes to the charter need to be made to address this issue.
It seems that perhaps an issue raised on the vc-data-model would be more appropriate so that the next WG can focus on this.
@OR13 wrote:
For suites relying on multicodec, this is an issue.
Why do you think this? Multikey defines both here:
https://w3c-ccg.github.io/data-integrity-spec/#multikey
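For reference, the Multikey description there covers both halves of a key pair; roughly this shape (a sketch, not copied verbatim from the spec; the values are made up):

```ts
// Sketch of a Multikey-style verification method; publicKeyMultibase and
// secretKeyMultibase are the spec's property names, the values here are fake.
const exampleMultikey = {
  id: 'did:example:123#key-0',
  type: 'Multikey',
  controller: 'did:example:123',
  publicKeyMultibase: 'z6Mk...', // multibase(multicodec(public key bytes))
  secretKeyMultibase: 'z...'     // multibase(multicodec(secret key bytes))
};
```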
@decentralgabe wrote:
the VCWG should only use normatively defined crypto suites, not define them itself.
This makes absolutely no sense. :)
You are making an argument to put the Data Integrity specification and all cryptosuites out of scope for the WG. That is the core reason we're rechartering the VCWG (to work on that stuff). You are also making an argument to not standardize the fundamental building block that did:key is built upon -- that's multikey.
@OR13, @kdenhartog, @decentralgabe -- you each seem to be arguing for something completely different, can the three of you please get on the same page so that we can understand what this issue is about?
@msporny my only opinion is that the key representations we use should be standardized somewhere. I have little opinion on the bureaucracy and charters that enable that standardization. we need to standardize on these representations if we care to use them across the ecosystem.
@msporny my only opinion is that the key representations we use should be standardized somewhere.
Are you ok with one of these places being W3C? It is, after all, a global standards setting organization.
absolutely! I'm sure you have a better sense of the correct group in the W3C than I do.
@decentralgabe wrote:
I'm sure you have a better sense of the correct group in the W3C than I do.
Yes, that group is the VCWG; that's the class of work item (multikey and cryptosuites) it's currently being re-chartered to work on, with the oversight of IETF's Security groups. :)
See the current charter Deliverables under Verifiable Credential Data Integrity v1.0:
https://w3c.github.io/vc-wg-charter/#deliverables
Ok, so to wrap this up, previously, you said:
the VCWG should only use normatively defined key representations and crypto suites, not define them itself.
and then you said (paraphrasing) "I'm fine with the correct group at W3C defining those things."
I'm asserting that group is the VCWG (because that's where all the people using multikey and the cryptosuites are gathered right now, and we're specifically liaising with IETF's Security Groups to do that work). To be clear, multikey is the only new key format we'll be defining; we're pulling in JWK, PEM, COSE, etc. from IETF.
So, I can only conclude that you agree with me at this point. Am I missing anything? :)
@kdenhartog wrote:
I'm a -1 to this
My arguments have been made in this thread: #51 (comment)
@kdenhartog can you open a new issue to track your concerns? It sounds like the issue title should be: "Do not define the multikey format in the VCWG". It sounds like what you're discussing and what @OR13 is mentioning are two different things, where @OR13 wants multikey defined in detail for both public and private keys, and you, @kdenhartog, do not want the VCWG to define it at all? Is that a proper interpretation of both of your positions?
@msporny thank you for the context and clarity. I agree with you.
To clarify a bit, what I'm after is the following:
1. Cryptosuites shouldn't fully define key representations. We should only be defining signature formats for usage with the VC data model. (Note: currently a cryptosuite defines both a "signature suite" and a "key format". I believe only the signature format is necessary, and the key format should be removed from these specs.)
2. Therefore, we shouldn't be normatively defining something like multikey, since it doesn't directly impact the VC data model. Since it doesn't directly impact the data model, I also don't think we should even be normatively referencing multikey in the cryptosuites, because I believe only signature formats need to be defined to process the proof.
3. I do believe multikey should be defined elsewhere.
Number 1 is directly related to this issue, where we're talking about the general abstraction model. Number 2 is a logical inference from number 1 that affects the charter text, namely whether we state that we'll be normatively defining multikey, because it's a specific instance of the general abstraction. Number 3 states the nuance that I do believe multikey is useful; I just don't believe this is the right place for it.
Hopefully that's a bit clearer, and I can further clarify if necessary.
Note - Currently a cryptosuite defines a "signature suite" and a "key format". I believe only the signature format is necessary and the key format should be removed from these specs.
- Cryptosuites shouldn't fully define key representations. We should only be defining signature formats for usage with the VC data model.
No, crypto suites should define signature and key formats, because both are necessary to define "sign" and "verify".
2. Therefore, we shouldn't be normatively defining something like multikey since it doesn't directly impact the VC data model. Since it doesn't directly impact the data model I also don't think we should even be normatively referencing multikey in the cryptosuites because I believe only signature formats need to be defined to process the proof.
No, for the same reason as above... though only if we decide to use signature suites that have to be defined here at W3C (i.e., ones that are not defined sufficiently elsewhere).
I will formally object to defining a signature format here, without defining the associated public / private key format... everyone else should as well... it creates interoperability and security issues... it will cause real harm.
Notice that JWK and JWA refer to each other:
We should do them together, or not at all.
3. I do believe multikey should be defined elsewhere.
Maybe.
If W3C is going to define a signature format, it must define the key format that goes with it.
If W3C is going to define signature formats that are meant to be paired with multikey it must define multikey.
The current proposals are targeting publicKeyMultibase... which is multikey.
The proposals could be rewritten to use JWA / JWK, or we could just use JsonWebSignature2020 and VC-JWT.
But that's not what the charter says currently.
If we are going to implement Ed25519Signature2020 with publicKeyMultibase... we better commit to doing that correctly, or not at all. Same is true for P-384 and Secp256k1.
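For context on what that entails: publicKeyMultibase is just the raw public key bytes prefixed with a multicodec header and then multibase-encoded. A minimal sketch for Ed25519 (assuming the bs58 npm package for base58btc; illustrative, not a normative definition):

```ts
import bs58 from 'bs58';

// multicodec ed25519-pub (0xed), encoded as an unsigned varint: 0xed 0x01
const ED25519_PUB_PREFIX = Uint8Array.from([0xed, 0x01]);

function toPublicKeyMultibase(rawPublicKey: Uint8Array): string {
  const prefixed = new Uint8Array(ED25519_PUB_PREFIX.length + rawPublicKey.length);
  prefixed.set(ED25519_PUB_PREFIX);
  prefixed.set(rawPublicKey, ED25519_PUB_PREFIX.length);
  return 'z' + bs58.encode(prefixed); // 'z' is the multibase prefix for base58btc
}
```

Getting the multicodec prefix or the base encoding wrong produces a key that simply fails to verify signatures, which is exactly the kind of mistake a test suite needs to catch.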
because both are necessary to define "sign" and "verify".
Incorrect: both are necessary to specify an API and implementation for "sign" and "verify", which, as far as I understand, we're not doing. We're defining a data model for a signature format that integrity-protects a credential. Case in point: Example 1 of the VC data model contains no key material or key parameters in the data model.
I will formally object to defining a signature format here
That's fine, you'll have to do what you believe is correct, but the threat of a formal objection doesn't deter me from arguing my point, which I've formed based on my experience building and evaluating a variety of different PKI systems and evaluating common attack patterns. I have no dog in this fight anymore; I'm not participating here because I have a product or service to offer, or something I've already built whose interests I have to protect. I'm offering my expertise with no return value gained because I want to see this done right.
it creates interoperability and security issues
Please don't handwave here; state specifically what issues you see, so that we can specifically address them. The signature will fail to verify if the key is incorrect or the implementation is poorly done, so I see no legitimate security concerns here. As for interoperability concerns, we're not defining a protocol or ceremony for resolving a URI to key material or key parameters. That's the concern of DID Core or some other PKI system like X.509 certs, and therefore I do not believe we should be redefining it here.
If W3C is going to define a signature format, it must define the key format that goes with it.
Not true, the convenience of defining them in the same place doesn't lead to the necessity of it being done that way.
If W3C is going to define signature formats that are meant to be paired with multikey it must define multikey.
That's my point: I'm arguing that I don't believe they should be paired in the first place. Furthermore, you do understand that multikey is a fairly poorly defined key format at this point, right? It has no way of conveying the purpose of the key material as a key parameter at this point in time, which, if left off, does produce legitimate security concerns such as key-reuse attacks.
The proposals could be rewritten to use JWA / JWK, or we could just use JsonWebSignature2020 and VC-JWT.
And in those cases I'd argue for separating them as well, as they already have been in the past. There's a reason JWK and JWS were separately defined. There's a reason that ECDSA and DER-encoded keys are separately defined in NIST documents and not normatively coupled. There's a reason that BLS signatures are specifically defined for the signature operation, but left silent about the key format in section 1.4 of the API. These are all documents that have been reviewed by tens if not hundreds of security experts with far more expertise than we have in our entire community, and not one of them has followed the pattern you're stating, nor have I found any errata raised to fix that because it was wrong. So where is the legitimate harm here?
If we are going to implement Ed25519Signature2020 with publicKeyMultibase... we better commit to doing that correctly, or not at all. Same is true for P-384 and Secp256k1.
This is a status quo bias. Just because things have been done this way in the past for this body of work doesn't mean it has to remain that way.
Data Integrity spec defines this already:
and pgp and jwk already have defined key formats... the risk is largely eliminated.
Having experienced converting both incorrectly from pem, binary, compressed, uncompressed, jwk and ethereum addresses... I can tell you it's a huge mistake to "encourage" more folks to be doing that.
It also violates the direction we took with did core... we have 2 normatively defined formats... it would have been better if we had only 1... but we have 2... you are suggesting it's a feature for the number to be unbounded... I don't agree, but I am happy to hash that out in the wg.
Having experienced converting both incorrectly from pem, binary, compressed, uncompressed, jwk and ethereum addresses... I can tell you its a huge mistake to "encourage" more folks to be doing that.
It also violates the direction we took with did core... we have 2 normatively defined formats... it would have been better if we had only 1... but we have 2... you are suggesting its a feature for the number to be unbounded.... I don't agree, but I am happy to hash that out in the wg.
This is absolutely a topic for the WG - I also would prefer one format, but I am certainly strongly opposed to an unbounded number of key formats, or we will make verification of interoperability impossible, or will accidentally promote the use of a key format that inadvertently introduces security risks. Basically, if we are defining a signature format in any way, we also have to define the key formats and their use. Note that in some cases that may mean pointing to a key format defined normatively elsewhere, but if there is not an existing standard to point to, then we define it normatively.
Data Integrity spec defines this already:
and pgp and jwk already have defined key formats... the risk is largely eliminated.
Having experienced converting both incorrectly from pem, binary, compressed, uncompressed, jwk and ethereum addresses... I can tell you its a huge mistake to "encourage" more folks to be doing that.
It also violates the direction we took with did core... we have 2 normatively defined formats... it would have been better if we had only 1... but we have 2... you are suggesting its a feature for the number to be unbounded.... I don't agree, but I am happy to hash that out in the wg.
Again, this is a status quo bias that I don't find overly compelling. I feel like I'm talking in circles with the three of you. Nothing you're saying is compelling me to believe that what has been done so far is the correct way to do this going forward, nor am I hearing or seeing evidence of legitimate harms, and I continue to present additional evidence in response to repeated arguments from your perspective. I'm sure if I dug through all of your codebases I'd see repeated instances of this being done, just as I saw with Digital Bazaar's. If this is the wrong way to approach the problem, why spend the time to write the code and actually merge it into the codebase?
This is absolutely a topic for the WG
This is a scoping discussion, so it needs to happen during chartering. If we legitimize this by requiring that this text be normatively defined, then in order to enter the CR phase we'll be forced to normatively define it. It's my opinion that it should not be normatively defined, and therefore I believe now is the time to have the scoping discussion, not once it's already become a requirement of the WG.
@kdenhartog wrote:
If this is the wrong way to approach the problem why spend the time to write the code and actually merge it into the codebase?
There are at least three reasons, possibly more: 1) it was assumed that there were going to be adequate test suites to cover key conversion in time, or 2) the code was written in a rush without much thought put into the ramifications or testing, or 3) it was done with the assumption that JWK and Multikey were going to be the only two mechanisms (which seem somewhat manageable IF there are decent test suites covering conversions between the two formats).
You seem to be arguing for arbitrary conversions between all possible key types. As I noted here:
https://github.com/w3c/vc-wg-charter/pull/51#discussion_r816839686
I expect some in the group to push towards your position, and I can see the advantages of doing so. I won't say "there are zero problems with doing that" at present because we haven't been able to see what the security attack surface looks like if we fully decouple keys from cryptosuites. If we are going to allow arbitrary key translation, we need to have a very thorough test suite for doing that e.g., (PEM, JWK, Multikey, COSE) -> (PEM, JWK, Multikey, COSE) -- that's 12 code paths to convert from one format to another that need to be tested for the 46 key types listed in the JOSE registry alone, so... 552 tests (at a minimum). As far as I know, no one in history has written that test suite. In fact, there is no comprehensive JOSE test suite (like there are for W3C specs), and even the PEM->JWK or the JWK->COSE test suite doesn't exist, AFAIK.
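To make those numbers concrete, here's a rough sketch of the matrix being described (the format list and the 46 key types come from the paragraph above; illustrative only):

```ts
// Enumerate the directed conversion paths between the four formats mentioned above.
const formats = ['PEM', 'JWK', 'Multikey', 'COSE'] as const;
const joseKeyTypeCount = 46; // key types in the JOSE registry, per the comment above

const conversionPaths = formats.flatMap((from) =>
  formats.filter((to) => to !== from).map((to) => ({ from, to }))
);

console.log(conversionPaths.length);                     // 12 conversion code paths
console.log(conversionPaths.length * joseKeyTypeCount);  // 552 tests, at a minimum
```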
It's do-able, but we're talking about requiring some pretty serious effort on that test suite alone. If you're volunteering to write that test suite and drive implementers to implement, that would go a long way to alleviate my concerns around arbitrary key translation. However, even if you volunteer to do that, you're starting to see others in the group push back against the concept of allowing arbitrary key representation formats to be usable in cryptosuites. I've floated this idea by people in the past and there was a pretty negative reaction to the concept. I'm happy for you to re-run the experiment.
This is a scoping discussion so it needs to happen during the chartering.
There seem to be two issues you're raising, @kdenhartog:
We don't have an issue for # 1 yet (or rather, it's clearly in scope based on a number of the input documents), and #2 is this issue. I suggest that if you want to also discuss # 1 above, you raise a separate issue for it.
@jyasskin I'd be curious where you fall on this topic as well as #97
You've argued before that things should be strongly defined as a general principle, which is a decent argument for tying key formats to the proof formats (and is what I interpret @OR13 to mean by "create interoperability issues"). However, my general take is that the extensibility to pair a signature format with a variety of different PKI systems, in a way that allows them to be built on currently deployed PKI systems like the CA hierarchy, is a rather appealing extension point to me, since it would allow a domain's cert to be used to issue VCs.
Do you think this extension point steps beyond the reasonable degree of certainty you'd be looking for in this spec? More specifically, would it lead you to formally object to the charter on behalf of Google if key formats were left undefined so that they could be leveraged as an extension point of this spec?
@kdenhartog thanks for keeping this issue alive, I think we are headed towards the right questions, and the answers need to be provided before we can consider the charter.
However, my general take is that the extensibility to pair a signature format with a variety of different PKI systems in a way that allows them to be built on currently deployed PKI systems like the CA hierarchy is a rather appealing extension point to me since it would allow for a domains cert to be used to issue VCs.
You can do this today... even without VCs, or any changes to any specs... You can always convert a domain cert key and store it in a well known jwks or a did document... here is a script that does that (I made it over a weekend, so there might be bugs).
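(For the public-key half, the conversion itself can be tiny; a minimal sketch using Node's built-in crypto module, with error handling and the private-key side omitted:)

```ts
// Illustrative only: convert a PEM-encoded public key (e.g. pulled out of a
// domain certificate) into a JWK using Node's built-in crypto module.
import { createPublicKey } from 'node:crypto';

// Returns a JWK object ({ kty, crv, x, y, ... }) for the given PEM public key.
function pemPublicKeyToJwk(pem: string) {
  return createPublicKey(pem).export({ format: 'jwk' });
}
```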
There is a difference between "knowing how to convert a key" and requiring a verifier / RP to convert keys because the spec made that a legal thing.
Option 1: did:example:123#key-0 might have a JWK, PEM, multicodec, Base58btc, CWK, PGP, TorKey, or SSH key behind it, and you will need to convert that key representation to a JWK before using an off-the-shelf library to verify a JWT.
Option 2: did:example:123#key-0 will have a JWK behind it when it appears as kid in a JWS or JWT.
The normative statement that protects users should we go with option 1 is:
"The proof.verificationMethod or header.kid MUST dereference to a key representation that can be converted to a JWK when used to verify a JsonWebSignature2020 or a VC-JWT"
The normative statement that protects users should we go with option 2 is:
"The proof.verificationMethod or header.kid MUST dereference to a JWK when used to verify a JsonWebSignature2020 or a VC-JWT"
@jyasskin how many different key representations will Google support for issuing and verifying a JWT?
@selfissued how many different key representations will Microsoft support for issuing and verifying a JWT?
Will Microsoft or Google support "Key Representation Agility" when verifying a signature in a known format such as a JWT?
Take a look at https://github.com/decentralized-identity/did-jwt/pull/212#discussion_r821156934 to see the issues this is currently causing... particularly the implementation burden for "key conversion".
"The proof.verificationMethod or header.kid MUST dereference to a key representation that can be converted to a JWK when used to verify a JsonWebSignature2020 or a VC-JWT"
I'm fairly certain that Digital Bazaar would object to Option # 2 UNLESS there is a test suite that ensures that any implementation doing the conversion does the conversion correctly. Every company that I know of has screwed up key conversion at some point in its lifetime, so if we're going to normatively allow it, we need to make sure appropriate test suites are in place to do all the NxN tests.
Option # 1 would be abdicating our responsibility, IMHO, which is what just about every security specification does when it suggests key conversion, but doesn't supply a test suite to ensure that all these key conversions are implemented correctly.
You can always convert a domain cert key and store it in a well known jwks or a did document...
The issue is that by doing this conversion from one PKI system to another, you're implying that the new PKI system has the same authorizations as the old one. For example, what if the CA chain was invalid? Do you throw an error in the key conversion code path?
This is why I'm looking to not tie the crypto suite to a specific representation: PKI systems almost always have a built-in authorization model (multikey2021 is the one format that doesn't, and it would produce harmful conversions, like JWK-to-multikey conversion, because the "use" and "key_ops" parameters would get dropped). By choosing to specify a PKI system, you're choosing to accept not only the key representation most common to that system, but also its authz model. This allows the PKI system to essentially operate as a chosen method for resolving key material, and it produces a pattern very similar to what happens with DID methods, where no implementation supports every DID method, but each is free to pick and choose which methods it supports resolving. Likewise, under this model no signature suite implementation is required to support a specific PKI system, only the ones it chooses to support. This is where the merits of the interoperability tradeoff come into play as a legitimate design decision, and why I was asking @jyasskin for his take, because I know he's not been a fan of some of the design choices made in the past around DIDs.
Ultimately, this is a design choice which I think comes back to the web principle of "put user needs first", because the current design requires us to specify not only a key format, but also an implied authz system and public key infrastructure, in order to support a particular signature format. The abstraction boundaries I'm looking to draw instead decouple the PKI system from the chosen format, in a way that allows users to depend on PKI systems that are already available, which I believe produces a more scalable approach to using VCs on the web. We'd be meeting users where they are today: instead of saying "hey, aren't VCs great? Oh, by the way, you probably need a DID to use them", we'd be saying "VC infrastructure can be built on top of common systems you already use today, like that HSM storing your CA private keys". To me that's an incredibly appealing design choice that I think will lead to greater adoption of VCs overall.
It's choices like this that are going to make it more appealing for groups like the ISO WG working on mDLs to consider switching away from their bespoke mDoc format to a generalized VC data model (they are just one well-known example; I'm sure there are other groups facing similar considerations), since they won't be forced into supporting a new PKI system and can choose VCs while keeping their ICAO-style PKI model for now (and hopefully change to DIDs later).
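To make the abstraction boundary being argued for here a bit more concrete, a rough sketch (hypothetical names, not proposed spec text): the cryptosuite consumes an already-resolved, already-authorized key, and everything PKI-specific stays behind a resolver interface.

```ts
// Hypothetical interfaces sketching the decoupling described above; not spec text.
interface ResolvedKey {
  algorithm: 'Ed25519' | 'P-256' | 'P-384' | 'secp256k1';
  publicKeyBytes: Uint8Array;
}

// Each PKI system (a DID method, an X.509/CA chain, a .well-known JWKS, ...)
// implements resolution AND its own authorization checks (key purpose, chain
// validity, revocation), so those checks never leak into the cryptosuite.
interface PkiSystem {
  resolveAndAuthorize(verificationMethod: string): Promise<ResolvedKey>;
}

// The cryptosuite only defines the signature format and how to verify it against
// key bytes; it does not dictate how those bytes were represented or resolved.
interface CryptoSuite {
  verify(data: Uint8Array, signature: Uint8Array, key: ResolvedKey): Promise<boolean>;
}
```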
I'm fairly certain that Digital Bazaar would object to Option # 2 UNLESS there is a test suite that ensures that any implementation doing the conversion does the conversion correctly.
Luckily we are required to write tests for all normative MUSTs.
Option # 1 would be abdicating our responsibility, IMHO, which is what just about every security specification does when it suggests key conversion, but doesn't supply a test suite to ensure that all these key conversions are implemented correctly.
I don't agree... Option 1 reflects a respect for existing code and standards.
To be frank, if we can't get consensus on this kind of thing, we should be dropping anything that might define a new key or signature format... we have JWK/JWS and CWK/CWS... the burden is on the proposers of new suites to explain how / why they are valuable / worth it... and right now, the proposals look harmful to interop.
I don't agree... Option 1 reflects a respect for existing code and standards.
You can't agree to the first statement and then disagree with the second (it's not logically coherent). I'm saying at least this, which I think you agree with -- "If you're going to do key conversion, and state it normatively, you need to have a test suite for it." Do you disagree with that statement?
To put it in other terms: Your assertion seems to be "We should normatively say that you can do key conversion AND we MUST NOT test for it." <--- seems to NOT be your position.
To be frank, if we can't get consensus on this kind of thing, we should be dropping anything that might define a new key or signature format.
Saying things like the above is not useful. You're making an argument for removing Data Integrity and multikey from the charter and I can guarantee you that the charter will be FO'd into oblivion if that happens. Please don't needlessly make statements that escalate things when we're so close to consensus.
@kdenhartog I haven't looked into this enough to have a strong opinion, but my instinct is that the WG would be wise to limit itself to using existing signature and public key formats that it can normatively reference. We’d then hope that the standards defining those formats would also define the matching private key formats. But (without having asked the security folks who'd actually make the decision) I don’t think Google would formally object to the charter if it merely allowed the WG to do the wrong thing here.
Note that there be dragons around everything related to defining new cryptography, including just defining a new serialization of cryptographic objects that someone else defined. If the WG tries to define something new and gets it wrong, or defines it ambiguously, that'd be a good reason to formally object to the final specification.
I realize it is a huge dump, but here is some stuff I've been working on, because I have been working with P-256: https://github.com/decentralized-identity/did-jwt/pull/212. For the most part, I used JWK => JsonWebKey2020. base58 was used deep in a test, so I used something like 'EcdsaSecp256r1VerificationKey2022'. PEM is in there, but not as a feature; it was so I could use an external library for a test that required PEM. My brain is kind of foggy. I'll look more at this thread in a bit.
@msporny
You're making an argument for removing Data Integrity and multikey from the charter and I can guarantee you that the charter will be FO'd into oblivion if that happens.
Why are you using threatening language? What would the Formal Objection be?
It seems very unusual that Data Integrity / multikey is even in scope for re-chartering here. I understand the VCWG output will have a dependency on those items, but it is clearly a unidirectional one: Data Integrity and multikey may be influenced by the needs of VCs, but are themselves independent and have utility well outside the scope of VCs.
Reading the current charter draft, my own understanding of the scope doesn't even include Data Integrity / multikey work, and in fact I would read it as excluding it (highlights mine):
In Scope: Algorithms for the expression and verification of proofs that use existing cryptographic primitives
Out of Scope: The specification of new cryptographic primitives
Data Integrity and multikey, to me, are closer to new cryptographic primitives than they are to VCs.
I completely agree with this comment. Interoperable key representations are a min-bar requirement.
@quartzjer
Reading the current charter draft, my own understanding of the scope doesn't even include Data Integrity / multikey work,
The normative deliverables section describes:
Verifiable Credential Data Integrity (VCDI) 1.0
This family of specifications consists of documents that each define how to express proofs of integrity for verifiable credentials using a number of concrete serializations for each of the defined syntaxes. The specific set of concrete serializations included will be determined by the Working Group. The following are a non-exhaustive selection of expected input documents:
Container Formats: VC-JSON Web Token (JWT), Data Integrity
Cryptosuites: JSON Web Signature 2020, EdDSA, NIST ECDSA, Koblitz ECDSA
Why are you using threatening language? What would the Formal Objection be?
There is a non-trivial community in the W3C that is expecting this group to do the Data Integrity work, and we have achieved consensus around that item. That text exists in the charter today. It was hard won, over many years, so if people are going to come in and remove language that reflects that hard won consensus, I expect there to be formal objections (on the basis that the charter is not doing one of the fundamental things that the charter was promised to do).
A quick clarification for everyone here and in the future: "Cryptographic Primitives" refers to low-level cryptographic algorithms. Examples include SHA-256 and RSA. It DOES NOT refer to the container formats (or "profiles" aka cryptosuites) that wrap data or parameters related to those primitives that are mentioned in the charter here.
I think it's helpful to be direct, sorry if my comment came across as threatening... But I also need to make my position clear, and sometimes conditioning helps do that.
There are layers that we are talking about, and that are relevant to the charter.
For JOSE existing JWK / JWT... there are standards that you can and should follow... we will do a better job of mapping to them in v2.
For new proof formats such as "Data Integrity" / formerly known as "Linked Data Proofs", there are conventions that bind a single key type to a single signature type:
and
I created this issue before folks defined secretKeyMultibase, and indeed there remain key types for which no private key is registered here: https://github.com/multiformats/multicodec/blob/master/table.csv#L150, but we know at least some of these private keys could be represented... we also know the cost of not defining them sufficiently... https://github.com/digitalbazaar/ed25519-signature-2018-context/issues/5
The goal of the issue seems achieved... folks realize how painful key conversion issues are... they realize how bad it is when you define only public keys, and the consensus is still out on whether the pain of key conversion should be built into the next W3C TR... awareness has been raised.
I suggest we close this issue, since there are currently 0 proposed crypto suites that DO NOT define both public and private keys.
If you want to remove a crypto suite from the charter, use another issue to do that :)
I suggest we close this issue, since there are currently 0 proposed crypto suites that DO NOT define both public and private keys.
+1 to close the issue.
We decided to close this issue, based on the conversation above.
The issue was discussed in a meeting on 2022-03-23
Only CryptoSuites that fully define both public and private key representations should be accepted as potential normative deliverables.
For suites built on JWK or PGP this is not an issue.
For suites relying on multicodec, this is an issue.