opengeospatial / ogcapi-common

OGC API - Common provides those elements shared by most or all of the OGC API standards to ensure consistency across the family.
https://ogcapi.ogc.org/common

Confirm compatibility with the use of JSON-LD #135

Open rob-metalinkage opened 4 years ago

rob-metalinkage commented 4 years ago

With the refactoring to a core that specifies only JSON, any issues of compatibility between GeoJSON and JSON-LD are removed.

My reading is that using JSON-LD is now just an implementation choice layered over plain JSON. It may be as simple as adding an "@context" of your choice to reference your data model (or its binding to URIs) expressed as a JSON-LD context.

A JSON-LD profile could be defined if further constraints on what this context must be are needed, but for now it seems sufficient to state that JSON in Common may support JSON-LD, and that JSON-LD implementation patterns are left to specialised profiles as future work.
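
As a minimal sketch of that reading (the context URL is hypothetical and the payload is only loosely modelled on a landing-page-style response), the only change to the plain JSON would be the extra "@context" member:

```json
{
  "@context": "https://example.org/contexts/ogcapi-common.jsonld",
  "title": "Example API",
  "description": "Landing page of a hypothetical OGC API implementation",
  "links": [
    {
      "href": "https://data.example.org/conformance",
      "rel": "conformance",
      "type": "application/json",
      "title": "Conformance declaration"
    }
  ]
}
```

A JSON-LD processor would dereference the context to interpret the terms; a plain JSON client can simply ignore the extra member.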

cmheazel commented 4 years ago

I think it would be a mistake to overload the JSON conformance class with a JSON-LD encoding. If there is a requirement for JSON-LD, then let's define the necessary schema and create an additional conformance class.

akuckartz commented 4 years ago

I think it would be a mistake to overload the JSON conformance class with a JSON-LD encoding.

Why?

joanma747 commented 4 years ago

I agree that JSON-LD should be a separate conformance class. The problem is that not every JSON file can be converted to JSON-LD easily. If we believe JSON-LD is important, it should be considered in the core as a design rule. For example, issue #27 about link representations has implications for JSON-LD. The current representation of links does not migrate naturally to JSON-LD.
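
To make the concern concrete, here is a sketch of a context for the current links array (the "oapi" vocabulary and every term mapping are hypothetical; they only illustrate the kind of modelling decision a context forces):

```json
{
  "@context": {
    "oapi": "https://example.org/vocab/ogcapi#",
    "links": { "@id": "oapi:link", "@container": "@set" },
    "href": "@id",
    "rel": "oapi:linkRelation",
    "type": "oapi:mediaType",
    "title": "http://purl.org/dc/terms/title"
  }
}
```

Aliasing "href" to "@id" turns the link target into the identifier of the link object itself, conflating the link description with the resource it points to; keeping "href" as an ordinary literal-valued property loses the IRI semantics. Neither reading is an obvious fit, which is the kind of friction meant here.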

rob-metalinkage commented 4 years ago

I am being agnostic as to whether JSON-LD needs to be a separate conformance class, but the main issue is to ensure the core does not destroy the possibility of using JSON-LD to make data a little more self-describing.

@joanma747 highlights these implications. This issue is to confirm the compatibility of whatever core emerges with the possibility of supporting JSON-LD, and thus with things like https://developers.google.com/search/docs/guides/sd-policies as well as numerous OGC activities exploring the use of JSON-LD.

The alternative is making quite a large bet that OGC API won't be obsoleted or ignored by communities who want to describe their data better than with anonymous private schemas.

dblodgett-usgs commented 4 years ago

@akuckartz -- in short, because there is a lot of baggage you have to carry around with JSON-LD. At its heart, it is RDF/OWL and using it properly is even more complicated because you have to cognitively traverse the JSON encoding of RDF.

That said, I do think it would be wise to keep the core compatible with JSON-LD. Everyone wants richly described schemas, and JSON-LD is the way that's done in Web engineering at this juncture.

Side note -- it's very interesting that Google says "sd-policies" in the URL for the "General structured data guidelines" page.

joanma747 commented 4 years ago

The intention is not to make the core dependent on JSON-LD. However, if there is something in the JSON encoding that prevents the JSON-LD transformation, please tell us exactly what the issue is and we will look for a solution.

rob-metalinkage commented 4 years ago

@dblodgett-usgs - what is the "baggage"? JSON-LD can be as simple as having an @context link that points to context documents that describe each element of the JSON schema.

Are you referring to the problem that RDF may be serialised to valid JSON-LD that does not match the original mandatory schema? I.e.

this works: JSON-LD(canonical schema + content) -> RDF
but this doesn't: RDF -> JSON-LD (default serialisation using absolute URIs and no context, or an inline context).

If so, is this actually just a problem with current JSON-LD tools, in that they don't allow you to serialise using canonical contexts? If that is so, I think it can be solved later as long as OGC API supports valid JSON-LD.

The potential problem occurs if you mandate JSON schema patterns that LD does not support. I'll take an action to try to derive a JSON-LD context for the current core, unless someone has already done this or claims to be a ninja with all the tooling ready to go. (If it's Python and available, pass it on and we'll seek to get it integrated into the RDFlib package!)
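
To make the asymmetry above concrete (using a deliberately trivial, hypothetical context), the same triples can come back from a generic serialiser in a very different JSON shape. Compacted against a canonical context:

```json
{
  "@context": { "title": "http://purl.org/dc/terms/title" },
  "@id": "https://data.example.org/",
  "title": "Example landing page"
}
```

Expanded form, which is what a default RDF-to-JSON-LD serialisation typically produces:

```json
[
  {
    "@id": "https://data.example.org/",
    "http://purl.org/dc/terms/title": [ { "@value": "Example landing page" } ]
  }
]
```

Both are valid JSON-LD for the same data, but only the first would also match a schema written for the plain OGC API JSON; getting back to that shape means re-compacting against the canonical context.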

rob-metalinkage commented 4 years ago

@joanma747 can you provide a reference here to material describing which JSON structures are not convertible to JSON-LD? Is there a checklist of patterns to avoid, or do we engage with the JSON-LD community and ask for guidance? I have project work committed to using JSON-LD and I want to find a pathway to using OGC specifications, so I can do some experiments or look at the possibility of updates. Work in OGC is widespread but somewhat hard to follow, and I suspect you have one of the best overviews.

dblodgett-usgs commented 4 years ago

@rob-metalinkage I think you did a good job of demonstrating the kind of baggage. Specifically, I'm talking about things like (but not limited to) JSON-LD keywords, particularly @id and @type, which constantly trip people up. The typical developer who just wants JSON to emit, transmit, and parse information isn't going to be bothered to worry about 90% of the details you describe... they just want it to work.

I'm not saying to avoid compatibility -- I think that would be 100% a good idea -- I'm just saying that it adds several layers of complexity to something that is supposed to be dead simple... and there be dragons.
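
For example (a hypothetical payload; the context URL is made up and is assumed to define the ordinary terms and the "oapi" prefix), the keywords in question appear as soon as nodes get identifiers and classes:

```json
{
  "@context": "https://example.org/contexts/ogcapi-common.jsonld",
  "@id": "https://data.example.org/collections/lakes",
  "@type": "oapi:Collection",
  "id": "lakes",
  "title": "Lakes"
}
```

A context can alias "@id" and "@type" to plainer names, but the aliasing itself is one more thing a JSON-first developer has to know about.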

rob-metalinkage commented 4 years ago

@dblodgett-usgs - good point - it's really JSON-LD = (any schema + context + ids) -> RDF, except where the JSON is a single object whose ID can be identified out-of-band, in which case you don't need ids.

Anyway, to keep things simple it's better to make the core compatible, so LD annotations can be added in future, rather than having to publish a special set of instructions for creating a different API that is equivalent to OGC API but slightly different in order to allow it to be LD-enabled, plus mechanisms to help identify which one you are actually using, since they will look and sound quite similar. It makes my head hurt to imagine having to maintain a parallel set of functionally equivalent APIs that aren't simply an extension.

cportele commented 4 years ago

Where is "compatibility with JSON-LD" defined? JSON-LD itself does not list any requirements or recommendations for JSON structures. Instead, it says things like the following:

The syntax is designed to not disturb already deployed systems running on JSON, but provide a smooth upgrade path from JSON to JSON-LD. Since the shape of such data varies wildly, JSON-LD features mechanisms to reshape documents into a deterministic structure which simplifies their processing.

So, what is the real concern here?

Also note that this issue does not discuss the OGC API design in general; it is about the JSON encoding conformance classes in OGC API standards.

rob-metalinkage commented 4 years ago

I will define "compatibility" as: no JSON structures that cannot be encoded as JSON-LD by the addition of an appropriate @context link. This is merely something to check, I think, now that GeoJSON is not mandatory in Common.

I would also suggest an informative note stating that Common may be extended with a profile for JSON-LD, so it's clear there is a pathway; you don't need to reject OGC API because it might not be compatible and it's too high a barrier to check and test in detail just to determine the feasibility of that pathway. Declarations of intent are easier for potential users than trying to compare details and work out intent.

cportele commented 4 years ago

OK, so where is the list of "JSON structures that cannot be encoded as JSON-LD" so that "compatibility" can be checked?

I assume this is about JSON-LD 1.1.

akuckartz commented 4 years ago

@cportele Less than 24 hours ago @rob-metalinkage wrote:

I'll take an action to try to derive a JSON-LD context for the current core, unless someone has already done this or claims to be a ninja with all the tooling ready to go. (If it's Python and available, pass it on and we'll seek to get it integrated into the RDFlib package!)

Such a context will help a lot.

rob-metalinkage commented 4 years ago

@cportele good question. I know others have looked into this more deeply than I have in the context of GeoJSON. I am hoping that if I can successfully build a context document that covers everything, we'll solve this from the other end :-). Now, where to put this in my list of 20 urgent tasks...

joanma747 commented 4 years ago

It needs to be tested. Can we generate an @context document for each response in OGC API to ensure semantic annotation and JSON-LD transformation? This could be a task for innovation in a future testbed, or in Testbed 16.
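
A sketch of what such a derived context might look like for a landing-page response (every target vocabulary below is a placeholder; choosing real ones is exactly the work a testbed task would have to do):

```json
{
  "@context": {
    "oapi": "https://example.org/vocab/ogcapi#",
    "title": "http://purl.org/dc/terms/title",
    "description": "http://purl.org/dc/terms/description",
    "links": { "@id": "oapi:link", "@container": "@set" },
    "href": { "@id": "oapi:href", "@type": "@id" },
    "rel": "oapi:linkRelation",
    "type": "oapi:mediaType"
  }
}
```

Here "href" is modelled as its own IRI-valued property rather than aliased to "@id", so each link stays a node of its own; whichever choice is made would need to be applied consistently across all response types.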

rob-metalinkage commented 4 years ago

OK - I started chasing things down trying to find a machine-readable definition of the core request and response payloads and gave up - it's not readily visible from the spec. Can I confirm that the relevant materials are the schemas at: https://github.com/opengeospatial/oapi_common/tree/master/core/openapi/schemas

Also, what is the status of the URIs in the XML namespaces? There is no infrastructure to resolve these and they are ad hoc in nature, e.g. http://www.opengis.net/ogcapi-features-1/1.0

If this horse has bolted, we can arrange to redirect these to something that behaves as a model in a canonical way, being set up at https://www.opengis.net/def/schema/{SWG}/{Schema}

We can put a "sameAs" to this, but what should the canonical form be? Having to put in explicit redirects for every ad-hoc URI pattern isn't a good place to be in, nor is duplicating infrastructure setup to handle different dispositions of human- and machine-readable versions of resources.
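
For instance (a sketch only; the canonical URI is just the {SWG}/{Schema} pattern filled with made-up values), the ad-hoc namespace URI could be related to its canonical counterpart with a single statement:

```json
{
  "@id": "http://www.opengis.net/ogcapi-features-1/1.0",
  "http://www.w3.org/2002/07/owl#sameAs": {
    "@id": "https://www.opengis.net/def/schema/example-swg/example-schema"
  }
}
```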

This throws up a whole bunch of issues, of course, which is why normative URIs need to go through the OGC-NA for review...

cmheazel commented 3 years ago

Once the Features and Geometries JSON SWG is established, we will have the guidance necessary to address this issue.