Hi CKAN & DCAT communities!
I'm looking for a specification of a JSON implementation of DCAT. Is there a generic specification for the JSON DCAT that @ckan is using? I couldn't find it. Here is what I've found so far:
US-specific JSON implementation, used by data.gov
[Source: https://project-open-data.cio.gov/v1.1/schema/]
Pro:
a systematic, versioned specification is published
Con:
bureauCode and programCode are required, US-specific fields
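To make the US-specific part concrete, here is a rough sketch of a single dataset entry in the data.gov (Project Open Data v1.1) style. The field names come from the linked schema; all of the values are invented for illustration:

```python
# Sketch of one dataset entry in the data.gov (POD v1.1) JSON style.
# Field names follow the linked schema; values are invented examples.
dataset = {
    "@type": "dcat:Dataset",
    "title": "Example Transit Feed",
    "description": "An example dataset entry.",
    "keyword": ["transit", "example"],
    "modified": "2015-01-01",
    "identifier": "example-001",
    "accessLevel": "public",
    "publisher": {"@type": "org:Organization", "name": "Example Agency"},
    "contactPoint": {"@type": "vcard:Contact", "fn": "Jane Doe",
                     "hasEmail": "mailto:jane@example.org"},
    # The two US-specific fields in question:
    "bureauCode": ["015:11"],
    "programCode": ["015:001"],
}

# The UK variant appears to be the same shape minus these two fields.
us_only = {"bureauCode", "programCode"}
print(sorted(us_only & dataset.keys()))
# → ['bureauCode', 'programCode']
```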
UK JSON implementation, used by data.gov.uk
[Source: https://guidance.data.gov.uk/dcat_fields.html]
Pro:
generic: bureauCode & programCode are "not required"
Con:
no specification: the webpage is a list of changes from the US spec, without being systematic. Some fields are defined based on what will be displayed by the data.gov.uk website
CKAN harvester supported by OKF (aka this GitHub)
[CKAN source: https://extensions.ckan.org/extension/dcat/#json-dcat-harvester] [OKF source: http://spec.dataportals.org/]
Pro:
seems to be supported by the Open Knowledge Foundation
Con:
I haven't been able to find a spec for it. Does such a thing exist?
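The lack of a spec matters in practice because consumers have to guess at the catalog shape. As a hypothetical sketch of the ambiguity we run into: some catalogs nest entries under a "dataset" key (the v1.1 convention), others are a bare list, and a real spec would pin this down:

```python
import json

def dataset_titles(catalog_json):
    """Return dataset titles from a DCAT-ish JSON catalog string.

    Handles both the v1.1 shape ({"dataset": [...]}) and a bare-list
    shape ([...]) -- exactly the kind of variation a spec would settle.
    """
    catalog = json.loads(catalog_json)
    datasets = catalog["dataset"] if isinstance(catalog, dict) else catalog
    return [d.get("title") for d in datasets]

print(dataset_titles('{"dataset": [{"title": "A"}, {"title": "B"}]}'))
# → ['A', 'B']
print(dataset_titles('[{"title": "C"}]'))
# → ['C']
```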
.
So here I am. If you have a link to a Google Doc or whichever webpage contains the specification of the format that @ckan is parsing, you would be helping a lot!
(I'm pinging @NTerpo, @jpmckinney, @technickle, @rebeccawilliams, @philipashlock and @ColinMaudry because you commented on a Gist related to this subject.)
(And I'm pinging @barbeau, @antrim, @drewda and @caywood because we're working on this issue on our side.)
Thanks!