osmandapp / OsmAnd


Errors in elevation data #7380

Closed Posteinwurf closed 2 years ago

Posteinwurf commented 5 years ago

Elevation data in the maps still has too much noise. This is especially true in cities, where tall buildings are interpreted as mountains (the resolution is not fine enough to distinguish between a building and the adjacent road).

One example from Berlin, where a perfectly flat road is wrongly shown (Screenshot_2019-08-08-08-06-13) as having a 52 per cent slope!

This leads to a grave routing error: ways with a 52 per cent ascent or descent have to be excluded from routing in the 'avoid hills' modes. So we have a perfectly flat road which is now excluded from routing by mistake.

Maybe there is no final solution for this (unless we get perfect-resolution data), but I hope someone has an idea how to ease the problem. Maybe use elevation data from user-generated tracks (where available)? Maybe filter the data in big cities?

The resolution needs to come from the map-data side (not from the routing-code side): the routing engine just cannot ignore a 52 per cent slope (no matter how short)! So there should be a way to know that these 52 per cent don't exist.

Posteinwurf commented 5 years ago

Another thought: maybe it is possible to generate the elevation values from the contour lines? These seem to be correct.

scaidermern commented 5 years ago

Maybe some smoothing can be applied to get rid of such tiny spikes?
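One concrete spike-removal technique (my suggestion, not named in the thread, and not OsmAnd's actual code) is a sliding median filter: it drops single-sample outliers while leaving sustained elevation changes largely intact. A minimal sketch:

```python
# Hypothetical sketch: a sliding median filter over raw elevation samples.
# A single-sample spike (e.g. a tall building) is replaced by the median
# of its neighbourhood, while sustained climbs are mostly preserved.
def median_filter(elevations, window=5):
    """Return a copy of the elevation list with single-sample spikes removed."""
    half = window // 2
    out = []
    for i in range(len(elevations)):
        # clamp the window at the edges of the track
        chunk = sorted(elevations[max(0, i - half):i + half + 1])
        out.append(chunk[len(chunk) // 2])
    return out

# A 30 m spike on an otherwise flat road disappears entirely:
print(median_filter([34, 34, 64, 34, 34]))  # -> [34, 34, 34, 34, 34]
```

Unlike a plain moving average, the median does not smear the spike into the neighbouring samples, which is why it is often preferred for exactly this kind of artifact.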

vshcherb commented 5 years ago

Smoothing will work badly in mountain areas.

Posteinwurf commented 5 years ago

Here is a related discussion, this time about a cycleway in the countryside.

I have observed the same issue several times in France this year.

https://github.com/osmandapp/Osmand/issues/5568

andreapx commented 3 years ago

Is there any news about this? I noticed it while hiking yesterday. I recorded the track with Locus Map, which shows an ascent of 1289 m. I've imported the same track into OsmAnd+ and it shows an ascent of 1700 m (Screenshot_20210817-162128, Screenshot_20210817-162150).

sonora commented 3 years ago

As the screenshot above perfectly illustrates: Deriving slopes by simply comparing single neighboring elevation values (if that's what we do?) is not meaningful, and using this for a routing decision is plain nonsense.

To lessen the issue, you need to group points and work with averages, and a certain compromise will inevitably be introduced by the grouping selection.

This issue is similar to what I faced when writing the code for the ascent/descent summation in the GPX track analysis; you can look at my code there from several years ago. The "grouping" I selected was to look at trend-channel segments, ending each segment when a meaningful elevation change in the opposite direction of the current trend is encountered. From my recollection I chose a 5 m elevation change as the meaningful turnaround threshold, meaning that elevation changes are consolidated until the trend channel changes direction by at least 5 m (at which point a new consolidation starts).

Transferring this idea to a slope calculation would mean reporting "the slope" not for each track point, but only (averaged) per each of these elevation trend-channel segments. This averages out single slope spikes by summarizing "the slope" over the entire (variable-length) distance for which the trend is either uphill or downhill. Of course this is a compromise, essentially just another smoothing mechanism, albeit one with a certain spatial flexibility. Could be worth a try.
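The trend-channel consolidation described above could be sketched roughly like this (an illustrative Python sketch under my own assumptions about the point format and names; it is not the actual OsmAnd GPX-analysis code):

```python
# Illustrative sketch (assumed point format: (distance_m, elevation_m)).
# Elevation changes are consolidated until the trend reverses by at
# least `threshold` metres; slope is then reported per trend segment
# instead of per point, averaging out single spikes.

def _segment(points, a, b):
    """Build (start_idx, end_idx, avg_slope_percent) for one segment."""
    dist = points[b][0] - points[a][0]
    rise = points[b][1] - points[a][1]
    return (a, b, 100.0 * rise / dist if dist else 0.0)

def trend_segments(points, threshold=5.0):
    segments = []
    start = 0      # first point of the current segment
    extreme = 0    # farthest point reached along the current trend
    direction = 0  # +1 uphill, -1 downhill, 0 not yet established
    for i in range(1, len(points)):
        elev = points[i][1]
        if direction == 0:
            if abs(elev - points[start][1]) >= threshold:
                direction = 1 if elev > points[start][1] else -1
                extreme = i
        elif (elev - points[extreme][1]) * direction > 0:
            extreme = i  # trend continues: advance the extreme
        elif (points[extreme][1] - elev) * direction >= threshold:
            # counter-trend move of >= threshold: close the segment
            segments.append(_segment(points, start, extreme))
            start, extreme, direction = extreme, i, -direction
    segments.append(_segment(points, start, len(points) - 1))
    return segments

# A 4 m spike on a flat road stays below the threshold and is absorbed
# into a single zero-slope segment:
print(trend_segments([(0, 34), (50, 34), (60, 38), (70, 34), (200, 34)]))
# -> [(0, 4, 0.0)]
```

On a real climb-and-descent profile the same function closes one uphill segment at the summit and reports each segment's average slope, which is what the per-segment slope idea amounts to.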

sonora commented 3 years ago

PS: @andreapx Having said all that, the data in the screenshot above looks severely faulty (it's not noise, as it is in one direction only): there are very regular spiky 'collapses' of about 100 to 200 m (!) in elevation (as if hiking through dozens of gullies).

That explains the huge difference between the overall ascents reported by different apps: even small differences in the smoothing parameters make a huge difference in this data set, an indication that the underlying data is very sensitive to interpretation.

I think there is a hardware weakness in play here which cannot easily be fixed without serious compromises for other legitimate use cases (e.g. dune hiking). As a workaround, setting a precision filter while recording might help you eliminate the spikes, if they are in fact bogus in your recording.

pebogufi commented 3 years ago

To lessen the issue, you can also use a "moving average", which reduces such "noise".
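A minimal sketch of such a moving average (illustrative only, not OsmAnd code). Note that it smears a spike into its neighbours rather than removing it, which is why the choice of window length matters so much:

```python
# Hypothetical sketch: a centred moving average over elevation samples.
# Noise is damped, but a spike is spread out rather than removed;
# too short a window keeps the spikes, too long flattens real terrain.
def moving_average(elevations, window=5):
    """Return a smoothed copy of the elevation list."""
    half = window // 2
    out = []
    for i in range(len(elevations)):
        # clamp the window at the edges of the track
        chunk = elevations[max(0, i - half):i + half + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# The 30 m spike is damped but leaks into every neighbouring sample:
print(moving_average([34, 34, 64, 34, 34]))
# -> [44.0, 41.5, 40.0, 41.5, 44.0]
```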

sonora commented 3 years ago

@pebogufi Yes, easy to code, but tough conceptually: I find no reasonably intelligent algorithm to determine the length of the sliding interval.

Afterthought, @vshcherb : Maybe in order to solve the original issue of routing becoming severely affected by artifacts,