w3c / did-rubric

W3C Decentralized Characteristics Rubric v1.0
https://w3c.github.io/did-rubric/

Include references to external evaluations of DID Core and DID Methods #45

Open OR13 opened 2 years ago

OR13 commented 2 years ago

@peacekeeper can you add any details here?

I will provide links to other sources as they become available.

peacekeeper commented 2 years ago

We worked with a company called SBA Research on evaluating DID methods. They have two papers published at the BPM 2021 Blockchain Forum:

The papers will be presented at BPM on September 6th-9th (the exact slot is not yet fixed).

Here is one link, but it's not free: https://link.springer.com/chapter/10.1007/978-3-030-85867-4_9.

I'll check if I can find a better link to this work that we can use.

jandrieu commented 2 years ago

Markus, I think the right option is to get their evaluations added through PRs.

One thing that @dhh1128 and I have on our plate is adding a section with additional source references for evaluations and evaluators. This should provide a good way to add their input to the W3C Note.

Please let them know we'd love to add their work, with proper citations, etc.

peacekeeper commented 2 years ago

> Markus, I think the right option is to get their evaluations added through PRs.

What does this mean exactly? Do you mean adding their actual content to this repo here? Which files and in which format? Upload a PDF?

Their paper includes evaluations, but it is more than that: it also contains insights about the DID Rubric itself, their experiences applying it, implementation considerations, etc.

jandrieu commented 2 years ago

I mean to add example evaluations (and even new criteria) to index.html through normal GitHub PRs.

Issue #46 (which I just created) will be the best place to track that. Once @dhh1128 or I extend the evaluation citations, it should be easy to add one-off examples.

peacekeeper commented 2 years ago

@jandrieu I like the idea of adding example evaluations directly to the document, and personally I'm very interested in ways of structuring evaluation reports so that they can be compared, perhaps even to the point where the reports can be machine-readable.
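Purely as an illustration of what such a machine-readable, comparable report could look like, here is a minimal TypeScript sketch. Everything in it (the type names, field names, criterion ID scheme, and sample values) is a hypothetical assumption for discussion, not a format defined by the DID Rubric or used in the SBA Research papers:

```typescript
// Hypothetical sketch only: the DID Rubric does not define a
// machine-readable report format. All names below are assumptions.

/** One evaluator's response to a single rubric criterion. */
interface CriterionEvaluation {
  criterionId: string; // e.g. "governance-1" (hypothetical ID scheme)
  response: string;    // the selected response option, e.g. "B"
  notes?: string;      // optional free-text rationale
}

/** A complete evaluation of one DID method by one evaluator. */
interface MethodEvaluation {
  didMethod: string;   // e.g. "did:example"
  evaluator: string;   // person or organization performing the evaluation
  date: string;        // ISO 8601 date of the evaluation
  source?: string;     // URL of the published evaluation, if external
  criteria: CriterionEvaluation[];
}

// Example entry with entirely made-up values:
const example: MethodEvaluation = {
  didMethod: "did:example",
  evaluator: "Example Evaluator",
  date: "2021-09-06",
  criteria: [
    { criterionId: "governance-1", response: "B", notes: "..." },
  ],
};
```

Structuring reports along these lines would let tooling line up two evaluators' responses criterion by criterion, which is what would make the comparisons mentioned above mechanical rather than manual.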

However, in our concrete evaluation project we have a Google doc and a conference paper. Converting those into a specific format and adding the result directly to the DID Rubric document would take significant effort, and it is simply not possible right now due to time and resource constraints.

So I created a PR that simply adds links to the existing (external) evaluation work: https://github.com/w3c/did-rubric/pull/47