Closed: raffazizzi closed this issue 9 years ago
This seems like a sensible solution to me, but additional discussion from Council is warranted before any deprecation. (Devil's advocate argument.)
Original comment by: jamescummings
Deprecation added to @degree on <precision> in r12606.
Original comment by: hcayless
With the addition of att.ranging to <precision> in release 2.5.0, this seems like a good time to reexamine the utility of the @degree attribute for precision. The need to express a lack of accuracy comes into play in cases like dates, estimates of the number of missing characters in a text, and estimated measurements. Vagueness is often indicated in the text by a qualifier like circa or about. We can think of examples like the dates "ca. 18 BCE" or "24 CE – ca. 38 CE"; "[ – c. 15 chars. – ]" in a text, indicating a gap of about 15 characters in width; "6th century CE"; or even "ca. 6th century CE".

In some of these cases an unexpressed "fudge factor" is being introduced by the use of circa. In the case of "6th century CE", the asserted accuracy is inherent in the date itself: it is accurate at the level of the century, no lower.
If we want these kinds of estimates to be machine processable (so that they can be picked up in searches, for example), the numbers, however vague, must be quantified. The members of att.ranging (@min/@max and @atLeast/@atMost) provide a straightforward way of quantifying the accuracy of an estimate in a way that @degree does not.
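To make the contrast concrete, here is a minimal sketch of how the "about 15 characters" gap above might be quantified; the choice of <gap> and the boundary values 13 and 17 are illustrative assumptions, not anything prescribed by the Guidelines:

```xml
<!-- "[ – c. 15 chars. – ]": the vague estimate quantified with att.ranging;
     the limits 13 and 17 are hypothetical readings of the editor's fudge factor -->
<gap reason="lost" unit="chars" atLeast="13" atMost="17"/>

<!-- alternatively, a single figure with only a qualitative flag on its accuracy -->
<gap reason="lost" unit="chars" quantity="15" precision="low"/>
```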
@degree is defined as "indicates the degree of precision to be assigned as a value between 0 (none) and 1 (optimally precise)" and as such is only useful as a relative measure of precision (measured against other precision elements). Despite being a number, it does not quantify accuracy at all, and therefore does the same job as @precision (which has the possible values "high" | "medium" | "low" | "unknown").

In sum, having a numeric value for expressing the level of accuracy that cannot be used to actually quantify that accuracy is a Bad Idea. It is misleading and confusing, even more so now that we have the tools to quantify accuracy properly. Because it is already in use in the wild, immediate removal is impossible, but the attribute should be marked as deprecated, with a note explaining how to express accuracy using att.ranging (or, alternatively, how simply to express a lack of confidence in the accuracy of a number using @precision).

Original comment by: hcayless