w3c / vc-data-model

W3C Verifiable Credentials v2.0 Specification
https://w3c.github.io/vc-data-model/

Tracking Issue: JWT-only support #634

Closed: burnburn closed this issue 5 years ago

burnburn commented 5 years ago

This is a tracking issue to make sure a particular concern is clearly recorded. Duplicates will be closed.

@nadalin has raised a concern that only the JWT format provides a trusted and well-defined set of signature (proof) methods that can be used to address his interoperability concerns[1][2]. He has asked that we select JWTs as the only normative proof method[3] and require support by all implementations for JWT.

The Working Group decided at its outset to remain flexible on proof method by defining a data model and, separately, one or more syntactic realizations of that data model and associated proof methods, expecting that other syntactic realizations and proof methods would be defined in the future. Dropping JSON-LD and the potential to support Linked Data Proofs in addition to JWTs is not acceptable to the group. In addition, there is a sizeable community that wishes to use zero-knowledge based proof systems, in a way not currently supported by JWTs, and therefore the existing flexibility must be retained.

[1] https://github.com/w3c/vc-data-model/issues/486
[2] https://github.com/w3c/vc-data-model/issues/489
[3] https://github.com/w3c/vc-data-model/issues/487
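The two syntactic realizations at issue differ in where the proof lives: a JWT carries the credential inside a signed envelope (an external proof), while a Linked Data Proof is embedded in the credential itself as a `proof` member. A minimal Python sketch of the two shapes, with placeholder identifiers and the actual signatures omitted (field names follow the spec's examples):

```python
import base64
import json

# A minimal unsigned credential body; the identifier values are
# illustrative placeholders, not real DIDs.
credential = {
    "@context": ["https://www.w3.org/2018/credentials/v1"],
    "type": ["VerifiableCredential"],
    "issuer": "did:example:issuer",
    "credentialSubject": {"id": "did:example:subject"},
}

def as_unsigned_jwt(vc: dict) -> str:
    """External proof: the credential rides inside a JWT whose signature
    would cover the whole payload (signature part left empty here)."""
    def b64url(obj: dict) -> str:
        raw = json.dumps(obj, separators=(",", ":")).encode()
        return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()
    header = {"alg": "ES256", "typ": "JWT"}
    return f"{b64url(header)}.{b64url({'vc': vc})}."

def with_embedded_proof(vc: dict) -> dict:
    """Embedded proof: a `proof` member is attached directly to the
    credential, as Linked Data Proofs do."""
    signed = dict(vc)
    signed["proof"] = {
        "type": "Ed25519Signature2018",  # one of several possible suites
        "proofValue": "...",             # placeholder, no real signature
    }
    return signed

jwt_form = as_unsigned_jwt(credential)
ld_form = with_embedded_proof(credential)
```

The data model itself is agnostic between these shapes; the concern in this issue is whether one of them must be mandatory.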

nadalin commented 5 years ago

@burnburn You can mandate that one form be implemented; that does not mean it is the only form that can be implemented, so there is no loss of the flexibility you describe.

TallTed commented 5 years ago

@nadalin -

It seems no more (and, in fact, substantially less) appropriate to mandate implementation of support for JWT-based proof systems (which would prevent use of some zero-knowledge-based proof systems) than to mandate that JSON-LD processing be implemented (which we have not done for various reasons, including some you have raised yourself).

All the objections you have previously raised against the latter (which would not have prevented any other functionality, and whose sources of misperception I believe we have addressed), and many more besides, would apply as much if not more to the former. That is why we have not included such a mandate in the specification of this data model.

nadalin commented 5 years ago

@TallTed This will come back up when interoperability is discussed.

nadalin commented 5 years ago

@TallTed There seems to be a serious misconception about MUST SUPPORT versus MAY SUPPORT. No one is saying that the JWT proof is the only mechanism; there should simply be something that improves the chances of interoperability. What is there now promotes zero interoperability.

TallTed commented 5 years ago

@nadalin - The Data Model which we have specified allows for innovation and maturation in the proof mechanism arena, including but not limited to JWT/JWS, LD-Signatures, and Zero-Knowledge Proofs, as well as mechanisms which may not yet exist (or at least, are not yet known of by those of us now involved), and for any of these to become preferred by implementers at large in the design and implementation of VC Protocol(s).

With this current Data Model, at the discretion of their developers and implementers, systems that use VCs may be developed and implemented without concern for interoperability with any other systems or implementers. Other systems that use VCs may be developed and implemented with a goal of universal interoperability. Still other systems may be developed and implemented with an interoperability goal somewhere in between these extremes. All of these are possible with the Data Model as currently specified.

Systems which are primarily focused on JWS/JWT may not be fully interoperable with systems which are primarily focused on Zero-Knowledge Proof Systems, or on LD-Signatures. Systems which are primarily focused on Zero-Knowledge Proof Systems may not be fully interoperable with systems which are primarily focused on JWS/JWT, or on LD-Signatures. Systems which are primarily focused on LD-Signatures may not be fully interoperable with systems which are primarily focused on Zero-Knowledge Proof Systems, or on JWS/JWT. These are all OK using the Data Model as specified.

Systems which are primarily focused on any of those proof mechanisms may also be built to be fully interoperable with the others. Whether this is necessary, desirable, or otherwise, will depend on the VC ecosystem as a whole.
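One way to picture a system built for interoperability across mechanisms is a verifier that first classifies how a credential carries its proof and then routes to the appropriate verification path. This is a hypothetical sketch, not anything defined by the specification; the classification heuristics and mechanism labels are assumptions:

```python
def detect_proof_mechanism(vc) -> str:
    """Classify a credential by how its proof is carried, so a verifier
    can route to the matching verification code path.
    (Hypothetical helper; the heuristics below are illustrative only.)"""
    if isinstance(vc, str) and vc.count(".") == 2:
        return "jwt"  # compact JWS serialization: header.payload.signature
    if isinstance(vc, dict) and "proof" in vc:
        ptype = vc["proof"].get("type", "")
        # ZKP-capable suites (e.g. CL or BBS+ signatures) get their own path.
        if "CLSignature" in ptype or "Bbs" in ptype:
            return "zkp"
        return "ld-signature"
    return "unknown"
```

Whether implementers bother to build such routing, or instead target a single mechanism, is exactly the discretion the current data model leaves them.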

As far as I can tell from your sometimes rather cryptic writing (incomplete and run-on sentences, odd turns of phrase, minimalist issue names/descriptions), you are primarily concerned with a kind of interoperability which will only become a concern when there are some VC-using protocols added to the mix.

Addressing that kind of interoperability will of course be a focus of any groups who develop such protocols, whether W3 Working Groups or otherwise. However, as we have said many times in many ways in many issues (mostly opened by you), it is now, and has been for some time, beyond the capacity of this Working Group to produce such protocols, or to address that kind of interoperability.

I look forward to your full, early, and active participation in such future groups as may tackle the next phases of VC ecosystem development, including but not limited to the development and standardization of such Protocols, which will make use of this Data Model.

nadalin commented 5 years ago

@TallTed The data-model-first approach does not always work well in practice when protocols are left out of scope, so my concerns about the interoperability of a data model alone remain. Is there a pointer to the test harness so I can see what is actually being tested?

TallTed commented 5 years ago

@nadalin - Surprisingly enough, the Verifiable Claims Data Model 1.0 Test Suite is linked from the Data Model ReadMe as well as from the Editor's Draft of the Data Model.

David-Chadwick commented 5 years ago

I think I understand @nadalin's concerns about interoperability. As the specification stands, we will have different non-interoperable islands of VCs, with each island supporting its own type of proof mechanism. This is OK if this is what we really want to happen. The received wisdom in this case is that the market will decide which island becomes the predominant one to rule the world, and all other islands will either follow suit, perish, or become tiny isolated islands. The market is not necessarily the best driving force to determine the best technology, cf. VHS vs. Betamax. But this is what will happen given the current specification. If OTOH we say that all implementations must support at least one specific type of proof mechanism and may support other types as well, then we would ensure that all implementations can interoperate from day one. I am not advocating either approach. I am just trying to outline what I see as the differences in the two approaches, without getting religious about either.
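The difference between the two approaches can be sketched as a set-intersection argument: with a mandatory-to-implement (MTI) baseline, every pair of conforming implementations shares at least one proof mechanism; without one, the support sets need not overlap at all. The implementation names and the choice of baseline here are purely illustrative:

```python
# Hypothetical baseline mechanism that every implementation MUST support;
# everything else is MAY support.
MTI = "jwt"

# Each (invented) implementation advertises its supported mechanisms.
impls = {
    "wallet-a": {MTI, "ld-signature"},
    "issuer-b": {MTI, "zkp"},
    "verifier-c": {MTI},
}

def can_interoperate(a: str, b: str) -> bool:
    """Two implementations can exchange VCs if their support sets overlap."""
    return bool(impls[a] & impls[b])
```

Without the MTI element, `{"ld-signature"}` and `{"zkp"}` share nothing, which is the "islands" outcome described above.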

burnburn commented 5 years ago

In the 16 July 2019 VCWG call:

RESOLVED: The Working Group has addressed all concerns outlined in issues #633 and #634 to the best of their ability, these are architectural decisions made by the group, and asserts that the specification reflects the consensus position of the Working Group and thus the issues should be closed.