iDigBioBot opened 6 years ago
TestField | Value |
---|---|
GUID | c6adf2ea-3051-4498-97f4-4b2f8a105f57 |
Label | VALIDATION_COORDINATEUNCERTAINTY_INRANGE |
Description | Is the value of dwc:coordinateUncertaintyInMeters a number between 1 and 20,037,509? |
TestType | Validation |
Darwin Core Class | dcterms:Location |
Information Elements ActedUpon | dwc:coordinateUncertaintyInMeters |
Information Elements Consulted | |
Expected Response | INTERNAL_PREREQUISITES_NOT_MET if dwc:coordinateUncertaintyInMeters is bdq:Empty; COMPLIANT if the value of dwc:coordinateUncertaintyInMeters is interpreted as a number between 1 and 20037509 inclusive; otherwise NOT_COMPLIANT |
Data Quality Dimension | Conformance |
Term-Actions | COORDINATEUNCERTAINTY_INRANGE |
Source Authority | |
Specification Last Updated | 2023-09-18 |
Examples | [dwc:coordinateUncertaintyInMeters="1": Response.status=RUN_HAS_RESULT, Response.result=COMPLIANT, Response.comment="dwc:coordinateUncertaintyInMeters is in range"] [dwc:coordinateUncertaintyInMeters="-1": Response.status=RUN_HAS_RESULT, Response.result=NOT_COMPLIANT, Response.comment="dwc:coordinateUncertaintyInMeters is out of range"] |
Source | ALA |
References | |
Example Implementations (Mechanisms) | |
Link to Specification Source Code | |
Notes | The upper limit is one half the equatorial circumference of the earth. |
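The Expected Response above can be sketched in a few lines. This is an illustrative sketch only, not a reference implementation; the function and constant names are invented for this example. The upper bound of 20,037,509 m is half of the Earth's equatorial circumference (about 40,075,017 m), rounded up to a whole meter, as the Notes row indicates.

```python
# Illustrative sketch of VALIDATION_COORDINATEUNCERTAINTY_INRANGE.
# Function and constant names are hypothetical, not from any
# published implementation of this test.

EARTH_HALF_CIRCUMFERENCE_M = 20_037_509  # half the equatorial circumference, rounded up


def validate_coordinate_uncertainty(value):
    """Return (Response.status, Response.result) per the specification above."""
    # bdq:Empty -> prerequisites not met
    if value is None or str(value).strip() == "":
        return ("INTERNAL_PREREQUISITES_NOT_MET", None)
    try:
        number = float(value)
    except ValueError:
        # Cannot be interpreted as a number -> falls into "otherwise"
        return ("RUN_HAS_RESULT", "NOT_COMPLIANT")
    if 1 <= number <= EARTH_HALF_CIRCUMFERENCE_M:
        return ("RUN_HAS_RESULT", "COMPLIANT")
    return ("RUN_HAS_RESULT", "NOT_COMPLIANT")


# The two Examples rows from the table:
print(validate_coordinate_uncertainty("1"))   # ('RUN_HAS_RESULT', 'COMPLIANT')
print(validate_coordinate_uncertainty("-1"))  # ('RUN_HAS_RESULT', 'NOT_COMPLIANT')
```

Note that both range endpoints are inclusive, so "1" and "20037509" are COMPLIANT while "0.5" and "-1" are not.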
Comment by John Wieczorek (@tucotuco) migrated from spreadsheet: This same test could implement the upper limit of 2002000 as well.
Comment by Paul Morris (@chicoreus) migrated from spreadsheet: Define and rename as COORDINATE_UNCERTAINTY_IN_VALID_RANGE
Comment by Arthur Chapman (@ArthurChapman) migrated from spreadsheet: Agree with @JW on the suggested changes
Comment by Paul Morris (@chicoreus) migrated from spreadsheet: I concur with JW; define as within range.
A lower bound of 1 seems wrong to me. The example for decimalLatitude in Darwin Core uses 7 decimal places, indicating the intention is to support very precise locations. With that, it seems reasonable to allow someone to declare, e.g., 0.05 m uncertainty.
Unless you are using differential GPS, sub-meter accuracy is not possible. With other current GPS systems, accuracy better than 4 meters is not even reasonable, despite what the device might tell you. The decimal places have nothing to do with this lower limit: seven decimal places are recommended to faithfully preserve coordinate format transformations without loss or drift.
Checking the test data on this, I came to the same conclusion as @timrobertson100. If multiple satellite systems were used (with a good PDOP), are we moving toward sub-meter uncertainty? Do we anticipate improved resolution into the future? Odd to think that I was using differential GPS over 20 years ago.
If we get sub-meter accuracy, we have to start asking if the organism was bigger than a meter. I don't much see the point.
I agree with @tucotuco. Even now you can get to cm accuracy or better using the right equipment, and the new datum in Australia and New Zealand will be to decimeter accuracy. But we are looking at biological entities: plants are often >1 m in diameter, animals move more than 1 m, and most people taking the observation don't put the recording device right on top of the entity in any case, but take the recording standing beside the plant, etc., or maybe at a distance from an animal. If you are doing work that requires greater accuracy, OK, but that is a rare exception, and there is no restriction on you being able to record that if needed.
The alternative is for very small organisms (bacteria, viruses) in soil or rock that could be highly localised. I figure my discomfort is that we allow for a value of 1.1 now but not for < 1.
The 1 as opposed to 0 is a protection. 0 is not a valid value, ever. I mean, ever. And without some number to delimit that, people will put 0, or some integer field in a source database will round to 0. Until I hear of a real-world case where it is actually necessary, I would urge not to go less than 1.
Hi, may I know why there aren't parameters such as bdq:minimumValidCoordinateUncertaintyInMeters and bdq:maximumValidCoordinateUncertaintyInMeters in this test, unlike the following?
Hi @ymgan. Interesting. We designed the test to use the upper and lower extreme limits for the concept. We did not consider that there might be tighter constraints to test against in some circumstances. I don't see a reason why the parameters could not be added, but with the caution that they should never be outside the default extremes.
@ymgan Can you give a case where you might want to add a parameter for these tests? Most large upper uncertainties are for cases where one may only have a locality of "Australia" or "Pacific Ocean", or totally unknown but somewhere on Earth, etc., with no finer information. We would encourage people to use larger uncertainties in those cases rather than guess. This test will largely identify where dwc:coordinateUncertaintyInMeters is recorded as <1 or a negative value, etc., which are likely to be errors. At this stage it is unlikely that coordinates could be recorded more accurately than 1 meter. Perhaps I can see cases where you may want to detect unlikely uncertainties in your database (<5 m, for example). Something to consider. We tried not to have parameterized tests where it was unlikely (or rare) that they would be used.
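If the test were parameterized as @ymgan suggests, a sketch might look like the following. This is hypothetical: the test as specified is not parameterized, and the parameter names simply follow the pattern of bdq:minimumValidCoordinateUncertaintyInMeters / bdq:maximumValidCoordinateUncertaintyInMeters mentioned above, with the caution @tucotuco raised that tighter limits should never fall outside the default extremes.

```python
# Hypothetical parameterized variant; the specified test uses only
# the fixed default extremes of 1 and 20,037,509 meters.

DEFAULT_MIN_M = 1           # default lower extreme (meters)
DEFAULT_MAX_M = 20_037_509  # default upper extreme (meters)


def is_uncertainty_in_range(value, minimum=DEFAULT_MIN_M, maximum=DEFAULT_MAX_M):
    """True if value parses as a number within [minimum, maximum]."""
    # Tighter local limits must stay within the default extremes.
    if minimum < DEFAULT_MIN_M or maximum > DEFAULT_MAX_M:
        raise ValueError("parameters must not fall outside the default extremes")
    try:
        number = float(value)
    except (TypeError, ValueError):
        return False
    return minimum <= number <= maximum


# e.g. flag uncertainties under 5 m as suspicious in a local workflow
print(is_uncertainty_in_range("3", minimum=5))  # False
print(is_uncertainty_in_range("3"))             # True under the defaults
```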
@ArthurChapman @tucotuco I had the question that I commented when I wanted to document the difference between the range that OBIS used and the range here. However, this is no longer necessary as we have just updated the range yesterday to align with the range mentioned here. I thought that it could be an oversight because this range test is similar to the range tests for depth and elevation. Thank you so much for your explanation, I think they make sense!
PS: I appreciate your hard work here as always! Thank you!!
Splitting bdqffdq:Information Elements into "Information Elements ActedUpon" and "Information Elements Consulted".
Also changed "Field" to "TestField" and "Output Type" to "TestType", and updated "Specification Last Updated".