Proposal for core model:
An extended version of the VFB model for images could work:
(C:Class:Neuron)<-[:INSTANCEOF]-(I:Individual:Neuron)<-[d:depicts]-(J:Image)
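
For concreteness, here is a minimal Cypher sketch of how one record could be instantiated under this pattern. The property names and values (e.g. label) are placeholders for illustration, not part of the proposal.

```cypher
// Hypothetical instantiation of the pattern above; names/values are placeholders.
MERGE (c:Class:Neuron {label: 'example neuron class'})
MERGE (i:Individual:Neuron {label: 'example neuron instance'})
MERGE (j:Image {label: 'example image'})
MERGE (i)-[:INSTANCEOF]->(c)
MERGE (j)-[:depicts]->(i)
```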
We could broaden the image slot to include other data types. Evidence goes on the depicts edge - this could be an individual that aggregates information. Experimental details can be connected to J. I’m hoping we can link to external sources for this, as I worry we don’t have the resources to curate + I’m reluctant to wade into the details of experimental modelling.
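
One possible reading of this in a property graph is sketched below: the depicts edge carries a reference to an evidence individual, and J links out to an external resource for experimental details. The relationship and property names (evidence_id, has_source, accession) are placeholders, not a settled vocabulary.

```cypher
// Placeholder sketch: evidence referenced from the depicts edge;
// experimental details delegated to an external resource linked from the image.
MATCH (j:Image {label: 'example image'})-[d:depicts]->(i:Individual:Neuron)
MERGE (e:Individual:Evidence {label: 'example evidence record'})
SET d.evidence_id = e.label
MERGE (s:Resource {label: 'example external repository'})
MERGE (j)-[:has_source {accession: 'PLACEHOLDER:0001'}]->(s)
```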
With this approach we can add additional axiomatisation to ‘I’ that is useful for query/inference. As we learn more about I, we can add a more precise class.
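
A rough example of the kind of query this typing of I supports, assuming classes are linked by SUBCLASSOF edges (that relationship name is an assumption for illustration):

```cypher
// Find images depicting an instance of a given class or any of its subclasses.
MATCH (c:Class:Neuron {label: 'example neuron class'})
MATCH (sub:Class)-[:SUBCLASSOF*0..]->(c)
MATCH (j:Image)-[:depicts]->(i:Individual)-[:INSTANCEOF]->(sub)
RETURN j.label AS image, i.label AS individual, sub.label AS class
```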
This needs to be combined with a different approach for definitional (reference) data. We already have individuals for dendrogram clusters with an exemplar (definitional) relationship to classes.
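
For contrast, the existing definitional pattern could be written in the same notation; the relationship name exemplar_of and the Cluster label are guesses at the current naming, used here only for illustration.

```cypher
// Existing definitional pattern (names illustrative): a dendrogram cluster
// individual acts as the exemplar for a class.
MATCH (cl:Individual:Cluster)-[:exemplar_of]->(c:Class:Neuron)
RETURN cl.label AS cluster, c.label AS class
```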
@cmungall Comments please: Should we be looking at BioLink instead?