uncefact / spec-untp

UN Transparency Protocol
https://uncefact.github.io/spec-untp/
GNU General Public License v3.0

Should we encourage issuers to leave terms undefined? #117

Closed: PatStLouis closed this issue 2 months ago

PatStLouis commented 5 months ago

The @vocab attribute from the base context leads to https://www.w3.org/ns/credentials/issuer-dependent#, which reads:

There may be cases, however, when the creation of such formal vocabularies and context definitions is too much of a burden, such as for closed applications, experiments, or examples.

I don't think production-grade systems should rely on this feature, and the UNTP model isn't a closed application, experiment or example. The original intent of the @vocab attribute was to facilitate development when entities didn't want to publish a context, but there are other convenient ways to support development.

@zachzeus @onthebreeze @nissimsan

nissimsan commented 5 months ago

@vocab is sort of a "catch-all". So if all terms are indeed properly defined, it makes no difference whether @vocab is there or not - there is nothing left to be "caught".
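To illustrate the "catch-all" behaviour, here is a minimal sketch of JSON-LD term resolution - not a real JSON-LD processor, and the `resolve` helper is purely illustrative: an explicitly defined term keeps its IRI, while @vocab only picks up terms the context leaves undefined.

```python
# Illustrative model of JSON-LD term resolution (NOT a real processor):
# explicitly defined terms win; @vocab only catches undefined terms.
VOCAB = "https://www.w3.org/ns/credentials/issuer-dependent#"

context = {
    "@vocab": VOCAB,  # catch-all inherited from the VC v2 base context
    "issuer": "https://www.w3.org/2018/credentials#issuer",  # defined term
}

def resolve(term, ctx):
    """Map a JSON key to an IRI roughly the way a JSON-LD processor would."""
    if term in ctx and term != "@vocab":
        return ctx[term]            # explicit definition takes precedence
    if ctx.get("@vocab"):
        return ctx["@vocab"] + term  # undefined term is "caught" by @vocab
    return None                      # no mapping at all

print(resolve("issuer", context))
# https://www.w3.org/2018/credentials#issuer  (unaffected by @vocab)
print(resolve("myCustomField", context))
# https://www.w3.org/ns/credentials/issuer-dependent#myCustomField
```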

nissimsan commented 5 months ago

One more thought, @PatStLouis: if we narrow the JSON Schema properly, we can guarantee that all issuers use our context - and then it's up to the spec to ensure that terms are properly defined.

This is exactly what I intended to demo on the video I recorded last month: https://youtu.be/0wgWxhmKyzk
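The schema-narrowing idea could look roughly like this - a hypothetical JSON Schema 2020-12 fragment that pins the @context array (the UNTP context URL below is illustrative, not the published one), plus a cheap stand-in for full schema validation:

```python
# Hypothetical sketch: narrow the credential schema so every credential
# must carry the UNTP context. The UNTP_CONTEXT URL is illustrative.
UNTP_CONTEXT = "https://test.uncefact.org/vocabulary/untp/dpp/0/context.jsonld"

schema_fragment = {
    "properties": {
        "@context": {
            "type": "array",
            "prefixItems": [
                {"const": "https://www.w3.org/ns/credentials/v2"},
                {"const": UNTP_CONTEXT},
            ],
            "minItems": 2,
        }
    },
    "required": ["@context"],
}

def uses_untp_context(credential):
    """Cheap stand-in for validating the fragment above with a real validator."""
    ctx = credential.get("@context", [])
    return (len(ctx) >= 2
            and ctx[0] == "https://www.w3.org/ns/credentials/v2"
            and ctx[1] == UNTP_CONTEXT)

good = {"@context": ["https://www.w3.org/ns/credentials/v2", UNTP_CONTEXT]}
bad = {"@context": ["https://www.w3.org/ns/credentials/v2"]}
print(uses_untp_context(good))  # True
print(uses_untp_context(bad))   # False
```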

Fak3 commented 5 months ago

Default vocab was pushed into the VC v2 context under pressure from the plain-JSON camp. It was expected to "simplify" development, but I believe it hurts interoperability by inviting collisions between keywords and full-URL terms for downstream developers of UNTP extensions. Recently, discussion about the default vocab emerged again in the VC group, and it may be removed in the future (for good, in my opinion). Meanwhile, I would vote that we set @vocab: null at the top of our context file to unset the default vocab and prevent term collisions for our downstream users.
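A minimal sketch of what `"@vocab": null` would do, assuming JSON-LD's rule that later contexts override earlier ones (the `expand_term` helper and the `product` term URL are illustrative, not part of any spec):

```python
# Sketch of the proposal: redefining @vocab as null in the UNTP context
# overrides the catch-all inherited from the VC v2 base context, so
# undefined terms fail to map instead of silently colliding.
base_context = {"@vocab": "https://www.w3.org/ns/credentials/issuer-dependent#"}
untp_context = {
    "@vocab": None,  # i.e. "@vocab": null in the JSON context file
    "product": "https://test.uncefact.org/vocabulary/untp/dpp/0/product",
}

def expand_term(term, *contexts):
    """Later contexts override earlier ones, as in JSON-LD context processing."""
    merged = {}
    for ctx in contexts:
        merged.update(ctx)
    if term in merged and term != "@vocab":
        return merged[term]
    if merged.get("@vocab"):
        return merged["@vocab"] + term
    return None  # undefined term: dropped on expansion instead of caught

print(expand_term("undefinedTerm", base_context))               # caught by @vocab
print(expand_term("undefinedTerm", base_context, untp_context))  # None
```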

PatStLouis commented 4 months ago

@nissimsan thanks for sharing this video. I like the term "semantic overlay" for describing the role linked data plays. It's also very effective at showcasing the value of a tightly defined schema.

The part that is not clear to me is how we will manage extensions. I think this will boil down to the process in place for someone to submit their extension to UNTP - a sort of UNTP extensions registry where contexts and schemas live.

I'm confident the core UNTP terms will be clearly defined; my concern was about party-submitted extensions and what level of peer review UNTP will apply in this process. I know @zachzeus is working on some test tooling which will probably help with this.

onthebreeze commented 2 months ago

As @PatStLouis points out, the core UNTP specs will have complete context files, so the @vocab property is redundant - the question really is about what rules UNTP will impose on "conformant" extensions. Do we want to say that UNTP extensions MUST NOT leave any terms undefined, and that testing for this is part of our UNTP extension registration process?
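Such a registration test might, very roughly, expand a sample credential with a real JSON-LD processor and then flag any property IRI in the issuer-dependent namespace - a hypothetical sketch (only the walk over the already-expanded document is shown):

```python
# Hypothetical conformance check: walk an *expanded* JSON-LD credential and
# flag any property IRI in the issuer-dependent namespace, which indicates
# a term the extension's context left undefined. Expansion itself would be
# done by a real JSON-LD processor; this is only the walk.
ISSUER_DEPENDENT = "https://www.w3.org/ns/credentials/issuer-dependent#"

def undefined_terms(expanded):
    """Collect issuer-dependent IRIs from an expanded JSON-LD document."""
    found = []
    if isinstance(expanded, dict):
        for key, value in expanded.items():
            if key.startswith(ISSUER_DEPENDENT):
                found.append(key)
            found.extend(undefined_terms(value))
    elif isinstance(expanded, list):
        for item in expanded:
            found.extend(undefined_terms(item))
    return found

expanded_doc = [{
    "https://www.w3.org/2018/credentials#issuer": [{"@id": "did:web:example.com"}],
    ISSUER_DEPENDENT + "customClaim": [{"@value": "oops"}],
}]
print(undefined_terms(expanded_doc))
# ['https://www.w3.org/ns/credentials/issuer-dependent#customClaim']
```

An extension would pass registration only if this list comes back empty.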

onthebreeze commented 2 months ago

In UNTP all terms are defined in context files, and the schemas require use of the relevant context file. See examples at https://test.uncefact.org/vocabulary/untp/dpp/0/about