niquola opened this issue 7 years ago (status: Open)
Argument for types as keys - we can easily implement FHIRPath expressions like res.attr.as(Quantity) by just translating them into the res.attr.Quantity expression, which is supported by most JSON databases.
mystifying how this is meaningfully better than just translating into res.attrQuantity instead of res.attr.Quantity... you'll have to find better justifications than that
attrQuantity does not allow collections - which means it's logically wrong. Also, this is very difficult to express as required in JSON Schema and for :missing search. Users can potentially put two elements attrType1 and attrType2 on the same resource, and this format does not prevent such mistakes; it's also not obvious how to constrain it with JSON Schema. It's indistinguishable from ordinary attributes that simply end in a type name, like birthDate. When working with this representation we need string operations on keys - that's a bad sign too. Too many signs ;) So semantically this is one element, but it is represented by different attributes.
that's a better argument...
Vision
Some resource elements can have a variable type; in the specification such elements have an [x] postfix (for example Observation.value[x]). In the JSON representation such elements are encoded by substituting the postfix with a specific title-cased type name:
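For example, an Observation whose value[x] carries a Quantity is encoded with a valueQuantity key (the concrete values here are illustrative):

```json
{
  "resourceType": "Observation",
  "status": "final",
  "valueQuantity": {
    "value": 6.3,
    "unit": "mmol/l"
  }
}
```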
Issue Statement
a) This approach to representation forces an unnecessary constraint: elements that have a choice of data type cannot repeat, i.e. they must have a maximum cardinality of 1. There is no fundamental reason to force this besides the format of the representation.
b) On the other hand, most object-oriented implementations of FHIR provide convenient accessors to “polymorphic elements”, like observation.getValue(), but when you work with the data without any object wrapper you have to handle this manually - for example, check which specific Observation.value you are dealing with by iterating through the object's keys.
c) JSON Schema is a very popular way to specify the shape of JSON objects, but its features are not sufficient to conveniently describe that only one of the postfixed keys is allowed in the object (i.e. valueString or valueQuantity), or, if the element is required, that at least one of those keys must be present (see the schema sketch below).
d) Implementing FHIR search for missing elements, like Observation?value:missing=true, in databases with native JSON support is tricky with this representation.
These problems are probably a consequence of a contradiction that was built into the representation - we represent one entity (a variable type attribute) with multiple entities (postfixed keys).
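To illustrate point c), here is a rough sketch of the kind of construct the postfixed representation forces on a schema author just to say "exactly one value[x] key is present" (the branch list is abbreviated and the Quantity $ref is assumed; a real schema would have to enumerate every allowed type, and making the element optional would require yet more branches):

```json
{
  "type": "object",
  "properties": {
    "valueQuantity": { "$ref": "#/definitions/Quantity" },
    "valueString": { "type": "string" },
    "valueBoolean": { "type": "boolean" }
  },
  "oneOf": [
    { "required": ["valueQuantity"] },
    { "required": ["valueString"] },
    { "required": ["valueBoolean"] }
  ]
}
```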
Solution 1
Here is a variant of encoding which solves the listed concerns:
A variable type attribute could be encoded as an object with a meta attribute, for example @type (or any other meta name), and with the value under a key named after the type (see the sketch below). This fixes the inconsistency of keys and explicitly embeds the type information into the representation. We also relax the arity-one constraint and can have collections of variable type elements. This addresses the concerns listed above:
a. collections of [x] elements
b. logical access to the element
c. JSON Schema required and mutual exclusion constraints
d. in databases we can access the element for search or indexes
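A minimal sketch of what this could look like for Observation.value (the example values are illustrative, and @type is shown only as one possible meta name, as noted above):

```json
{
  "resourceType": "Observation",
  "status": "final",
  "value": {
    "@type": "Quantity",
    "Quantity": { "value": 6.3, "unit": "mmol/l" }
  }
}
```

A repeating element would carry an array of such objects under value, which is how the arity-one constraint is relaxed, and an expression like res.attr.as(Quantity) maps onto the plain JSON path attr.Quantity, as argued in the opening comment.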
Solution 2
While the first solution uses a key named after the type (i.e. someProp.string), another approach is to have a fixed key:
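A minimal sketch of the fixed-key shape, assuming a type discriminator key (the name type here is illustrative) next to the fixed value key referenced below:

```json
{
  "someProp": {
    "type": "Quantity",
    "value": { "value": 6.3, "unit": "mmol/l" }
  }
}
```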
CONS
Having different types under the same path creates problems for schemata. For example, when you index JSON in Elasticsearch, you have to provide mappings that identify element types (see https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping.html), and you cannot have different types for the same path.
Also, this is error-prone in JavaScript applications - most of the time someProp.value will work, until a variant arrives whose type is incompatible with the code :(
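To make this concrete, here are two hypothetical documents (wrapped in a JSON array only for brevity, with the type key name assumed as in the sketch above) in which the same path someProp.value holds an object in one case and a string in the other - exactly the situation that breaks a single Elasticsearch mapping and surprises code expecting one shape:

```json
[
  { "someProp": { "type": "Quantity", "value": { "value": 6.3, "unit": "mmol/l" } } },
  { "someProp": { "type": "string", "value": "approximately six point three" } }
]
```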