Closed smathermather closed 3 months ago
Community conversation on this here: https://community.opendronemap.org/t/default-gps-accuracy/19510
I think this would warrant a decent amount of dataset testing; over-constraining the BA problem this way might make the program more sensitive to GPS measurement errors, causing failures when such errors are present.
I don't know how often such errors happen in practice. Changing the default could be OK if it's beneficial for most datasets (and doesn't substantially raise the failure rate).
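To make the trade-off concrete, here's a toy 1-D sketch (not ODM's or OpenSfM's actual solver) of how a weighted least-squares adjustment balances a GPS prior against image-derived constraints. Each residual is scaled by 1/σ, so shrinking the GPS accuracy value makes a biased GPS reading pull harder on the solution; all values below are made up for illustration:

```python
# Toy 1-D illustration: a camera position x is estimated from a GPS prior
# (measurement g, standard deviation sigma_gps) and an image-derived
# constraint tying it toward a neighbor estimate x_rel (std sigma_rel).

def fuse(g, sigma_gps, x_rel, sigma_rel):
    # Minimize ((x - g)/sigma_gps)^2 + ((x - x_rel)/sigma_rel)^2,
    # whose closed-form solution is the inverse-variance weighted mean.
    w_gps = 1.0 / sigma_gps**2
    w_rel = 1.0 / sigma_rel**2
    return (w_gps * g + w_rel * x_rel) / (w_gps + w_rel)

# Suppose the GPS reading is biased by 5 m (g=5) while imagery says x=0.
# At the old 10 m default the prior barely moves the solution; at 2 m it
# pulls much harder -- the failure mode being weighed in this discussion.
print(fuse(g=5.0, sigma_gps=10.0, x_rel=0.0, sigma_rel=1.0))  # ~0.05
print(fuse(g=5.0, sigma_gps=2.0, x_rel=0.0, sigma_rel=1.0))   # 1.0
```

The flip side, per the doming observation below, is that a loose prior lets systematic image errors bend the model away from GPS entirely.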
In the dozen sets I tried, not a single failure at 2m. Sounds like @Saijin-Naib uses 3 and has yet to see a failure.
The only time I need to relax it is when using ground-level mobile phone collects with my TeraCube, but even the default 10m isn't nearly coarse enough; 30m is normally needed.
Edited to 3m on @Saijin-Naib and Mike_F's recommendation.
For a couple of years now, I've been of the mind that the default GPS accuracy under-constrains the model relative to the GPS values, resulting in calibration issues which present as excessive doming in the elevation data produced. Based on processing dozens of datasets from the community forum, the accuracy (in this case, really a measure of consistency) for individual flights (rather than aggregate datasets) can be safely tightened to 0.2m / 2 decimeters.
Doming in dataset from Monrovia, Liberia
For aggregate (multi-flight) datasets, where GPS consistency can drift between flights, it's safer to use a larger value such as 2m.
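For anyone wanting to test this on their own data, the value can be overridden per run without waiting on a default change. A sketch of a Docker invocation, assuming a typical setup (the dataset path and project name are placeholders):

```shell
# Override the default GPS accuracy for a single run.
# /my/datasets and "project" are placeholder paths/names -- adjust to taste.
docker run -ti --rm -v /my/datasets:/datasets opendronemap/odm \
  --project-path /datasets project \
  --gps-accuracy 2
```

For single-flight datasets, the same flag with a tighter value (e.g. `--gps-accuracy 0.2`) exercises the per-flight recommendation above.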
I'm currently testing the switch to 2m with roughly a dozen community datasets shared with me over the last few years:

![ezgif-6-1a59c8ad6c](https://github.com/OpenDroneMap/ODM/assets/1174901/e0ef54cb-c527-4967-8f57-2e6b820703c0)
Thoughts? Any good counter examples? I have yet to break a dataset by reducing to 2m, but I could be missing salient examples.