w3c / vc-test-suite

Verifiable Credentials WG Test Suite
https://w3c.github.io/vc-test-suite/
BSD 3-Clause "New" or "Revised" License

any tests show interop between implementations? #99

Closed dckc closed 4 years ago

dckc commented 4 years ago

Are there any tests that show a credential issued by one of the implementations and checked by another?

Perhaps the answer is "all of them"? If so, are the credential documents checked in anywhere? If not, would you please share an example and explain how I could reproduce the results?

I'm trying to understand what sort of market is established by this spec. To what extent should I expect credentials issued by, for example, Sovrin-Ken_Ebert to be verified by uPort, and vice versa?

brentzundel commented 4 years ago

The test suite only checks adherence to the data model. As you are aware, this is only part of the system. Theoretically, any implementation that complies with the data model will be able to understand any credential that complies with the data model. That is, the semantic meaning of each of the claims in the credential should be understandable.
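
To make that concrete, here's a minimal sketch of a credential that conforms to the core data model, loosely following the spec's own examples at https://www.w3.org/TR/vc-data-model/ (the DIDs, claim values, and proof value below are hypothetical):

```ts
// A minimal credential conforming to the VC Data Model 1.0 core.
// All identifiers and values are hypothetical; the shape follows the
// examples in the specification.
const credential = {
  // The first @context entry must be the base credentials context.
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1"
  ],
  // "type" must include "VerifiableCredential".
  type: ["VerifiableCredential", "AlumniCredential"],
  // The issuer, expressed as a URI (often a DID).
  issuer: "did:example:issuer",
  issuanceDate: "2019-06-01T00:00:00Z",
  // The claims themselves; any data-model-conformant processor can
  // read these, whichever implementation produced them.
  credentialSubject: {
    id: "did:example:subject",
    alumniOf: "Example University"
  },
  // The proof is what makes the credential *verifiable* -- and it is
  // exactly the part that differs between ecosystems, as noted below.
  proof: {
    type: "Ed25519Signature2018",
    created: "2019-06-01T00:00:00Z",
    verificationMethod: "did:example:issuer#key-1",
    proofPurpose: "assertionMethod",
    jws: "eyJhbGciOiJFZERTQSJ9..<hypothetical>"
  }
};
```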

Verification of these credentials is, unfortunately, another story. Because each credential implementation may use a completely different signing mechanism, there is no expectation that they will be interoperable in this way. Though there is great interest in the community in continuing to work toward credential-exchange interoperability, that work is not happening in the VCWG.

You might want to check out some of the community efforts in this space.

dckc commented 4 years ago

... Because each credential implementation may use a completely different signing mechanism, there is no expectation that they will be interoperable in this way.

Am I reading that right? W3C charters a Verifiable Claims Working Group and doesn't think this sets an expectation of interoperability for verification of claims? Bizarre!

And yet the charter does seem to suggest some approaches to it are out of scope. But the problem statement in the primer pretty clearly sets expectations that this work is supposed to solve this problem:

Problem ... There is no interoperable standard capable of expressing and transmitting verifiable claims that works the same across industries (e.g., finance, retail, education, and healthcare). This leads to fragmented industry-specific solutions that are costly and inefficient.

This bit also seems to set an expectation of interoperable verification:

Why is a W3C standard necessary? Cross-industry interoperability.

cc @msporny

msporny commented 4 years ago

Are there any tests that show a credential issued by one of the implementations and checked by another?

Hey @dckc, since you're familiar with this space, here's the real answer -- there were standards politics at play that prevented the group from specifying one mandatory digital signature mechanism. The non-politics part of that discussion was based on the observation that some digital signature schemes are good for some use cases and bad for others. If you put the digital signature mechanisms to the side, then yes, as @brentzundel noted above, the data model and the syntax used to express it are interoperable... but going further was too much for the initial charter (a vocal minority of W3C Members refused to go that far), so we have what we have today because we couldn't get consensus beyond the core data model and syntax. That's what the test suite checks, and we do have multiple implementations that passed those tests:

https://w3c.github.io/vc-test-suite/implementations/#conformance-testing-results

Perhaps the answer is "all of them"? If so, are the credential documents checked in anywhere? If not, would you please share an example and explain how I could reproduce the results?

Instructions on running the test suite are here:

https://w3c.github.io/vc-test-suite/

I'm trying to understand what sort of market is established by this spec. To what extent should I expect credentials issued by, for example, Sovrin-Ken_Ebert to be verified by uPort, and vice versa?

Sovrin and uPort use incompatible signature formats, AFAIK... but the VCs themselves, if they follow the specification, are compatible. What this means is that you have to use an ecosystem specific library to check the signature, but once you do that, you can use a generalized processor to access data in the VC.
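
Sketched in TypeScript, with hypothetical stand-ins for the ecosystem-specific signature libraries (none of the function names below are real library APIs), that split looks roughly like this:

```ts
// Sketch of the pattern described above: proof verification is
// ecosystem-specific, while data access is generic. The verify*
// functions are placeholders, not real library APIs.

interface VerifiableCredential {
  "@context": string[];
  type: string[];
  issuer: string | { id: string };
  issuanceDate: string;
  credentialSubject: { id?: string; [claim: string]: unknown };
  proof: { type: string; [property: string]: unknown };
}

// Ecosystem-specific step: each proof suite needs its own verifier.
async function verifyEd25519Proof(vc: VerifiableCredential): Promise<boolean> {
  // ...call into an Ed25519 linked-data-proof library here...
  throw new Error("placeholder: wire up a real Ed25519 verifier");
}

async function verifyCLSignature(vc: VerifiableCredential): Promise<boolean> {
  // ...call into a CL-signature (anoncreds-style) library here...
  throw new Error("placeholder: wire up a real CL-signature verifier");
}

// Dispatch on the proof type to pick the right ecosystem's library.
async function verifyProof(vc: VerifiableCredential): Promise<boolean> {
  switch (vc.proof.type) {
    case "Ed25519Signature2018":
      return verifyEd25519Proof(vc);
    case "CLSignature2019":
      return verifyCLSignature(vc);
    default:
      throw new Error(`no verifier available for proof type ${vc.proof.type}`);
  }
}

// Generic step: once the proof checks out, any data-model-conformant
// processor can read the claims, whichever ecosystem issued them.
function issuerId(vc: VerifiableCredential): string {
  return typeof vc.issuer === "string" ? vc.issuer : vc.issuer.id;
}
```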

Not ideal, but the ecosystem continues to evolve, the VCWG is being rechartered, and there is ongoing work to build out the Linked Data Signatures portion... so, in short, work continues: the first iteration was successful, and we hope to nail down some of the other standards in the second iteration.

Probably not the answer you wanted @dckc, but there it is. :)

dckc commented 4 years ago

So the concerns I'm raising were known at chartering time and the decision was to move forward anyway.

I'm tempted to object to the proposed rec anyway, on the grounds that the title of the WG, if not the spec itself, sets expectations that it doesn't meet, and hence this is a poor use of W3C's good name.

But I guess this isn't a hill I'm willing to die on. And I suppose there's a possibility that this spec will be part of catalyzing a functioning market. So... whatever.

dckc commented 4 years ago

p.s. My real issue is that interoperability between blockchain platforms involves a lot more than strings (DIDs) and JSON blobs (claims) and that W3C shouldn't be pretending otherwise.

I suspect ocaps and mobile code are necessary and sufficient... a la Agoric's JS smart contracts and IBC. But maybe data in this format will be useful in that setting in any case.

msporny commented 4 years ago

So the concerns I'm raising were known at chartering time and the decision was to move forward anyway.

The concerns were known, and a way forward that moved us toward the goal achieved consensus. Was the charter or outcome ideal? No... but standards development is far from an ideal process with ideal outcomes. Because of the work, the US Government (and other governments around the world) is pushing private industry to adopt the standard and demonstrate interop... there are other market forces at work here. US Customs and Border Protection has telegraphed that any future systems solution should support many of the specs and standards (DIDs, VCs, etc.) coming out of W3C as a result of the work of the VCWG. We were definitely not in that position before the work started... very far from it.

Yes, all of us want this stuff to move faster and provide better interop than it does today... and getting stuff done at W3C is harder in some ways than it was even a decade ago... the membership is larger and more diverse, and the process has improved, but consensus these days is harder because there are more of us involved in the work.

My real issue is that interoperability between blockchain platforms involves a lot more than strings (DIDs) and JSON blobs (claims) and that W3C shouldn't be pretending otherwise.

As you know, W3C isn't some monolithic entity... so, I don't know who's doing the pretending. :)

We work on the problems that we can get consensus to work on and there is only as much interop as we were able to achieve consensus on. Remember that this work started with just a handful of people with an idea... there is always room for improvement and there is a community group of 300 people plus two working groups of 80+ people with 20+ active participants meeting every week to make that happen today.

... and all of that over the past 3 years... that's progress.