Semantic-Observations / obs-models

Alignment of semantic observation models

align SSN:Observation to OML:observation #21

Open. ramonawalls opened this issue 9 years ago

ramonawalls commented 9 years ago

Should they be equivalent classes?

dr-shorthair commented 9 years ago

Nope. ssn:Observation is an information object that describes an observation. oml:Observation is an observation event or activity.

The original derivation of SSN from DOLCE hinted at this issue, and [1] confirms my interpretation using the PROV core classes. In fact, Compton et al. needed to introduce a new class 'ActivityOfSensing', which appears to correspond to oml:Observation. Here are the PROV alignments (red arrows = disjoint classes):

[Figure: PROV alignments of the SSN and OML observation classes; red arrows indicate disjoint classes]

and my current interpretation of the implications for the SSN-OML alignment:

:ActivityOfSensing owl:equivalentClass oml:Observation .
ssn:featureOfInterest owl:equivalentProperty oml:featureOfInterest .
ssn:observationResult owl:equivalentProperty oml:result .
ssn:observationResultTime owl:equivalentProperty oml:resultTime .
ssn:observedProperty owl:equivalentProperty oml:observedProperty .
ssn:Sensing rdfs:subClassOf oml:Process .
ssn:Sensor rdfs:subClassOf oml:Process .
ssn:observedBy rdfs:subPropertyOf oml:procedure .
ssn:sensingMethodUsed rdfs:subPropertyOf oml:procedure .
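To make the distinction concrete, here is a minimal Turtle sketch of a single sensing event described with the OML terms used in the axioms above. The ex: instances (ex:obs1, ex:lakeA, ex:airTemperature, ex:thermometer1, ex:result1) are hypothetical and the prefix IRIs are indicative only; this is an illustration of how the aligned terms would be used, not part of the proposed mapping itself.

@prefix ex:  <http://example.org/obs#> .
@prefix oml: <http://def.seegrid.csiro.au/ontology/om/om-lite#> .
@prefix ssn: <http://purl.oclc.org/NET/ssnx/ssn#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

# One sensing event (the activity), typed as oml:Observation,
# i.e. the proposed equivalent of :ActivityOfSensing.
ex:obs1
  a oml:Observation ;
  oml:featureOfInterest ex:lakeA ;           # equivalent to ssn:featureOfInterest
  oml:observedProperty  ex:airTemperature ;  # equivalent to ssn:observedProperty
  oml:procedure         ex:thermometer1 ;    # super-property of ssn:observedBy
  oml:resultTime        "2015-04-02T09:00:00Z"^^xsd:dateTime ;  # equivalent to ssn:observationResultTime
  oml:result            ex:result1 .         # equivalent to ssn:observationResult

# The sensor that performed the observation; per the axioms above,
# ssn:Sensor is a subclass of oml:Process.
ex:thermometer1 a ssn:Sensor .

Under the proposed equivalences the same event could equally be typed :ActivityOfSensing and described with the corresponding ssn: properties; what it could not be is an ssn:Observation, which on this reading is the information object describing the event rather than the event itself.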

[1] M. Compton, D. Corsar, K. Taylor, "Sensor Data Provenance: SSNO and PROV-O Together at Last", in: 7th International Workshop on Semantic Sensor Networks, 2014, p. 16. http://knoesis.org/ssn2014/paper_9.pdf (accessed April 2, 2015).

dr-shorthair commented 9 years ago

(BTW, I am currently revising the OML paper to include this material.)

ramonawalls commented 9 years ago

Thanks for clarifying, Simon. We will still need to decide whether or not we include SSN in our mapping. I suggest that the main focus of the mapping (e.g., the figure) be OBOE, OML, and BCO, and that we provide other mappings as auxiliary findings (because the authors of SSN and PROV were not part of the mapping process).

wernerkuhn commented 9 years ago

It would seem an omission to me if SSN were not part of this effort. The fact that none of its authors were present at our workshop was unfortunate, but it should not affect the overall alignment effort. Having done the original research underlying SSN, I will try my best to account for it, pulling in Jano and others where appropriate.

--  Werner 

ramonawalls commented 9 years ago

Thanks, Werner. Sorry for spacing out on that. Yes, it does indeed make sense to include it in that case.

wernerkuhn commented 9 years ago

Thanks, Ramona, could you point me to the evolving paper draft again please?

--  Werner 

ramonawalls commented 9 years ago

Looks like you found it!

https://docs.google.com/document/d/1Yn1RscG1_EOKenMs3QCYDfNVTU2LTHZryOs57_VMxbo/edit

wernerkuhn commented 9 years ago

Ah, I was not sure whether it had moved elsewhere, thanks!

dr-shorthair commented 9 years ago

Note that the figure above is from [2] (revised and re-submitted to the Semantic Web Journal), where the SSN/PROV/OML alignment is presented in a little more depth.

[2] S.J.D. Cox, "Ontology for observations and sampling features, with alignments to existing models", submitted to the Semantic Web Journal. http://www.semantic-web-journal.net/content/ontology-observations-and-sampling-features-alignments-existing-models

dr-shorthair commented 8 years ago

An updated version of [2] is now on the Semantic Web Journal site. No significant differences in the paper for our purposes.

http://semantic-web-journal.net/content/ontology-observations-and-sampling-features-alignments-existing-models-0

dr-shorthair commented 8 years ago

Semantic Web Journal paper officially accepted for publication (2015-12-12) http://semantic-web-journal.net/content/ontology-observations-and-sampling-features-alignments-existing-models-0

dr-shorthair commented 8 years ago

Also see this presentation from AGU: http://www.slideshare.net/drshorthair/pitfalls-in-alignment-of-observation-models-resolved-using-prov-as-an-upper-ontology

ramonawalls commented 8 years ago

Great, Simon! Glad to see it published.