Closed: PaulStets closed this issue 7 years ago
@sonora this is somehow related to our distance computation, i.e. the track distance & recorded distance widgets. It is a tricky question, but I don't remember us ever calculating distance with "ele", so devices such as Garmin always report longer routes in the mountains.
It would be useful if OsmAnd could use Naismith's rule when reporting time for walking.
Adding this to the route analysis would be quite easy as the amount of climb and distance is available, but including this in the remaining time shown when navigating is more difficult, since it looks like the elevation data is not currently available in that part of the code.
Two stories here:
Travel time forecast: Yes exactly, njohnston's comment hits the nail on the head. Even though you may consider this 'old school', and we may try to improve upon it, it has always been my assumption and personal practice for many years that every experienced hiker or expedition planner would naturally treat OsmAnd and the distance data we deliver as the 2D projected data.
You will then filter in the elevation changes (we now show the elevation profile for routes rather precisely), the gradients and their continuous lengths, terrain estimation and trek/ground/vegetation viability, personal fitness, baggage, group size, and so on. Then, depending largely on personal preference you would either use Naismith's rule (which allows for extra time for elevation changes), Scarf's rule (equating elevation changes to additional distance), personal pace considerations, or bluntly experience to arrive at the individual route card.
Long story short: For anybody experienced enough, the data OsmAnd delivers already today is the perfect basis for such planning activities. For mere 2D walking tours, what OsmAnd delivers is also the correct time projection, just like for car navigation (where we also provide reasonable arrival time estimates but make no extra allowance for e.g. winding mountain roads). On the other end of the scale, where terrain, vegetation, ground, trail classification etc. come into decisive play, I see little we could automatically provide/project today to replace the hiker's experience. So the gap we could try to cover is the middle ground of the scale, considering ascents and descents while assuming some standard 'hiking trail' viability and using Naismith, Scarf, and/or pace adjustments for a projection.
Have not looked at our code to this respect, neither may I find enough time for a few weeks here, but I think we already do a simple pace adjustment calculation in bicycle routing now, so it may not be too difficult to extend that to ped routing.
Distance calculation: Very likely we today use the 2D projected distance only. Need to look at the code to see how easy this would be to correct. People coming from paper maps will never have had an issue with this, because they would use the above mechanisms anyway to convert distance into what counts for the situation, which is mostly travel time. So some may actually argue we do not need to adjust our distance formula. Not sure, let me think about it. Some thoughts on that warmly welcome here.
Looks like we should apply the following concept:
(A) All distances where no route has been calculated, like e.g. the distance indication to search results or POIs, are 2D ('as the bird flies'). This is already implemented.
(B) For all distances along a calculated route we could now apply a 3D distance formula which factors in the elevation changes along its elevation profile.
(C) For the travel time indication along such a calculated route we could try to apply something like Naismith (at least for ped/bike routing).
Right?
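A minimal sketch of point (B), summing per-segment Pythagoras over an elevation profile. The helper name and array shapes are hypothetical, not the actual OsmAnd API:

```java
public class Route3DLength {

    // Sum per-segment 3D lengths via Pythagoras from the route's
    // 2D segment lengths and the elevation at each route point.
    // segment2d[i-1] is the 2D length between point i-1 and point i.
    static double length3d(double[] segment2d, double[] elevation) {
        double total = 0;
        for (int i = 1; i < elevation.length; i++) {
            double d = segment2d[i - 1];                 // 2D length of segment i
            double dh = elevation[i] - elevation[i - 1]; // elevation change over segment i
            total += Math.sqrt(d * d + dh * dh);
        }
        return total;
    }

    public static void main(String[] args) {
        // A 400 m flat segment followed by a 400 m segment climbing 300 m:
        // 400 + 500 = 900 m in 3D vs 800 m in 2D.
        System.out.println(length3d(new double[]{400, 400}, new double[]{0, 0, 300}));
    }
}
```

The SRTM-based route profile is already smooth, so this simple summation would not suffer from the GPS noise problem discussed below for recorded GPX tracks.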
Looks like we use MapUtils.getDistance(), which needs to be enhanced by a 3D component if we have elevation deltas and not just lat/lon deltas to provide.
I was mostly thinking about GPX data, not applying it to all other screens, because that could be quite confusing. I'm concerned that noise could make realtime calculation tricky. So the safest would be to try changing it in GPXAnalysis only first.
It is really tricky. I doubt there is really enough benefit to enhance our entire distance formula to 3D to make sure every single point-to-point distance calculation we perform is elevation-aware. I think most things need to remain 2D.
In GPX analysis it is especially tricky because there the (recorded) elevation data carries a varying amount of noise. Since we also use distance as an axis to graphically present the data, we may create funny distortions. Chances are that for axis representation (and some other purposes) we want to always stay with and refer to the projected 2D distance.
I was thinking we start by just providing a 3D length for calculated routes, where we have had a well-behaved elevation profile from the SRTM grid ever since we introduced elevation profiles for routes. Route distance is likely also the major use case behind this issue?
For GPX analysis we could provide an overall 3D length (number value only) on the analysis screen only, derived by summing over the elevation trend channels I use for ascent/descent summation (simple Pythagoras each). This fixes the noise problem we otherwise have there. But this method would not provide 'live' updates of e.g. remaining 3D distance when following a route-by-GPX (without always major CPU intensive recalculations).
Meaning: It would be a mechanism a little different from what we need for route length, there we would probably perform this summation point by point and not using trend channels but using the SRTM data which is already smooth.
PS: Sanity check. I briefly went through hiking guides and it turns out that even the tougher mountain hikes usually do not exceed a ratio of 2000 m elevation gain per 10 km of trail. For that value the corrected 3D distance would actually be just under 10.2 km, a mere 2% effect vs. the 2D distance. If you ask me, not worth doing anything at all here.
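The sanity check above can be verified with plain Pythagoras (illustrative arithmetic only, not code from the app):

```java
public class Distance3DCheck {

    // 3D distance from a 2D projected distance plus total elevation change.
    static double dist3d(double dist2dM, double climbM) {
        return Math.sqrt(dist2dM * dist2dM + climbM * climbM);
    }

    public static void main(String[] args) {
        double d3 = dist3d(10000, 2000);  // ≈ 10198 m for 10 km with 2000 m of gain
        double excess = d3 / 10000 - 1;   // ≈ 0.0198, i.e. the quoted ~2% effect
        System.out.println(d3 + " " + excess);
    }
}
```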
Paul's initial example is flawed: If the 2D distance between the 2 mountains along the trail is 2 km, OsmAnd will certainly display that.
But if the geometric assumption is that the trail from each peak goes down in a straight line to the valley in the middle (i.e. 500 m away from each peak in 2D and, per Garmin, 1 km long in 3D), the dip would be 866 m over 500 m, a whopping average 173% slope!
I suggest we close the issue with no action unless someone provides an example where it would actually matter.
@sonora: I wanted to respond to this earlier but it slipped my mind. Your point about how relatively little slope impacts distance is interesting. I agree that 3D distance is not worth the effort.
However, I do think that providing a more realistic time estimate is important and useful. I looked into adding this myself, but quickly realised that the elevation information is only available in the GPX analysis part of the code. (Even the route preview elevation information is based on analysis of a temporary GPX file for the route.)
One simpler option could be to ignore elevation data completely and base the time remaining on the user's average speed during the route/navigation so far. Overall I think this is a poor option, though, because it doesn't handle routes that are initially flat but later become steep.
The better (but more difficult) option is to calculate the amount of ascent for each "leg"/way of a route, and cache this (with the cache being invalidated and repopulated with different data if the route is recalculated), and then, as the route progresses, estimating the remaining time using Naismith's rule (or similar) based on the remaining distance and remaining ascent (for the ways in the route which have not yet been traversed). @vshcherb: does this sound feasible?
Yes, I had already made the point way above that 3D distance calculation (the original scope of this Issue) and elevation change aware hiking time projection are 2 different animals. For the sake of clarity we could move discussing the latter to a new Issue.
Along the lines of your thinking: Naismith's and most similar rules do not depend on the elevation profile, but are rather simply based on 2 parameters only, the projected distance on the map and the total ascent to be covered.
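Naismith's rule in its basic two-parameter form can be sketched as below. The constants (5 km/h base speed, one extra hour per 600 m of ascent) are the commonly quoted values for the rule, not numbers taken from the OsmAnd code:

```java
public class NaismithRule {

    // Naismith's rule: 1 hour per 5 km of projected (2D) distance,
    // plus 1 hour per 600 m of total ascent; descent is ignored.
    static double hoursNeeded(double distanceKm, double ascentM) {
        return distanceKm / 5.0 + ascentM / 600.0;
    }

    public static void main(String[] args) {
        // 10 km with 600 m of climb: 2 h walking + 1 h for the ascent = 3 h.
        System.out.println(hoursNeeded(10, 600));
    }
}
```

Note that the elevation profile itself does not enter the formula, which is why only the remaining distance and remaining total ascent need to be known along the route.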
If in OsmAnd, during an ongoing navigation, you go to the Directions dialog and tap on the route details (the line showing the remaining distance/time information), you can see that we provide the number value for the remaining total ascent on the resulting route details screen (below the route's elevation graph). This means that on that screen at least we could very easily provide a Naismith-corrected estimation for the remaining hiking time.
Chances are that yes, if we wanted to provide that ongoing in the arrival time estimation on the map screen widget, we would have to cache the number array of "remaining total ascent" together with the route information (for pedestrian routes), not a low impact change to our code, I think. But also far from impossible.
PS: The more straightforward option is that we do not cache the ascent array, but simply calculate the total ascent for each route segment (only for pedestrian routes) at the time of route calculation and immediately filter the corresponding Naismith correction into the "travel time needed" value we store for each route segment (much like we filter in penalties for stop lights, etc.). Probably a simpler change, but it means we trigger a GPX analysis (to find the ascent value) for each route segment at the time of route calculation.
PS2: All we discussed may enormously increase the route calculation time, because the time needed (and I guess people would want us to optimize the Naismith-corrected time needed) is the key parameter to optimize when the algorithm searches for the fastest route ...
@sonora: I agree that we should create a new issue for elevation-aware time estimates.
Providing a more accurate time estimate even just in the route details screen would be useful. Like you say, it's trivial to implement there since the total climb is available in analysis.diffElevationUp. However, a much lower remaining time while following the route could cause confusion.
I've been looking at the code and it seems that RouteDataObject has heightDistanceArray, which is a list of heights at points along a segment. If my understanding is correct, this could be useful for calculating the remaining ascent to apply Naismith's rule in getLeftTime in RouteCalculationResult.
PS2: All we discussed may enormously increase the route calculation time, because the time needed (and I guess people would want us to optimize the Naismith-corrected time needed) is the key parameter to optimize when the algorithm searches for the fastest route ...
I don't think that this increases the route calculation time. The only change needed for routing engine is a change of routing.xml, just change obstacle_srtm_alt_speed section of pedestrian to:
    <way attribute="obstacle_srtm_alt_speed">
        <if param="height_obstacles">
            <gt value1="0" value2=":incline">
                <select value="0"/>
            </gt>
            <select value="6"/>
        </if>
    </way>
Now you get a 6 second penalty per meter of ascent (= Naismith, 1 h per 600 m). Inclines smaller than 1% are filtered out.
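The 6 s/m figure is consistent with Naismith's rule: at one extra hour per 600 m of ascent, each meter of climb costs 3600/600 = 6 seconds (illustrative check only):

```java
public class AscentPenaltyCheck {

    // Time penalty in seconds for a given ascent, at 6 s per meter of climb.
    static double penaltySeconds(double ascentM) {
        return ascentM * 6.0;
    }

    public static void main(String[] args) {
        // 600 m of ascent costs 3600 s = 1 h, matching Naismith's 600 m per extra hour.
        System.out.println(penaltySeconds(600));
    }
}
```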
For ETA, calculateTimeSpeed() in RouteResultPreparation class could be used (add the height penalty to distOnRoadToPass). Height array can be calculated by calculateHeightArray() for every RouteDataObject (road) of the route. Now you have the time (including height penalty) for all segments of the route.
Yes, but I think the height array may be obtained by our GPX analysis code, which is rather CPU intensive. In any case it sounds feasible; will open a new user story in an issue to document this better so we can decide if this should be put in our backlog.
obstacle_srtm_alt_speed will definitely increase calculation time, but it is not clear by how much, because it impacts the routing result, whereas calculateTimeSpeed doesn't change the route. In mountain areas we've got a 300-500% time slowdown for bicycle routing. Anyway, as @Zahnstocher pointed out, it is pretty easy to try.
Just wanted to add that Maps.me does include the elevation somehow, because you get much longer times for routes going upwards. And quite realistic ones, as far as I have tested. And the calculation is extremely fast. How did they do that?
Time difference should work now, I implemented and tested already in November, and it should be in our 2.9 release.
@sonora Why isn’t it activated by default? It wasn’t in my installation. I had to activate it in the routing options by clicking on the gear and selecting “use height data” (translated, I don’t know how it reads in the English interface). Please make it the default.
That may be a valid point; I think the feature is well tested by now. But I do not want to make any uncontrolled code changes this close to the next release (@vshcherb)
Some time has passed, can we reconsider this?
@sonora and @vshcherb I am still waiting for this to be activated, because I see no reason for not turning it on by default. I use it all the time, and the times calculated are quite good. I have to activate it in every new installation for my hiking clients when I help them install Osmand, which is really annoying. :-)
Could you please explain me, why you don’t want to activate it by default? What is the problem?
Ok, over the last two days I watched the travel time forecast. It was very good for level and ascending ways, but it was very bad for descending ways.
For example I had a way going down (strictly decreasing!) about 450 m of altitude in 2.5 km. The given ETA was 30 min. In reality the fastest walker of my 14 persons group needed (not running!) 60 min, and the slowest 90 min (quite slow, though). The majority of the group arrived after 70 minutes.
Do you need more examples to improve the ETA for descending ways? Oh, well, there seems already to be a useful rule: The Langmuir corrections.
Is it possible to use and play around with the parameters mentioned in #3680 and #4804? Or how is the ascending time calculated at the moment? I would like to try it – is it possible without recompilation?
Just discovered this old thread, which I had never picked up again: Yes, descending trail portions have no correction for now from what I remember, and yes, Langmuir would be the next order of corrections to implement. Somewhat more involved, though; not convinced it is really worth it, maybe some time. :)
OK, but still: Why not activate it by default? Just like all other modern programs do it. Is there any disadvantage?
Well, and if you give me a hint where in the code the Naismith rule is implemented, I would like to implement the Langmuir rule and maybe create a pull request after testing.
Ok, found it: the //for Naismith comment in OsmAnd-java/src/main/java/net/osmand/router/RouteResultPreparation.java, and the routing xml parameter height_obstacle in the pedestrian profile.

@sonora "Activation by Default", pull request created: https://github.com/osmandapp/OsmAnd-resources/pull/535
@Zahnstocher Yes, thanks, my concern was more whether there are situations where this could create problems, like if you have no elevation data activated or present. But let's test anyway!
PS: Are you sure it has been tested enough to default it in the bicycle profile, too? I do get some weird asymmetries between the outbound and return trip when the ele checkbox feature is activated, but I have not tested bicycle much.
Please note that using elevation data horribly slows down calculation speed in mountain areas.
The Langmuir corrections:
- subtract 10 minutes for every 300 meters of descent for slopes between 5 degrees and 12 degrees
- add 10 minutes for every 300 meters of descent for slopes greater than 12 degrees

This can be translated into:
- angle of 5° ⇔ slope of 8.7%
- angle of 12° ⇔ slope of 21.3%
- ±10 min/300 m ⇔ ±1 h/1800 m ⇔ ±3600 s/1800 m ⇔ ±2 s/m
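The degree-to-percentage conversions above can be checked with the tangent of the slope angle (illustrative arithmetic only):

```java
public class LangmuirThresholds {

    // Slope gradient (rise over run) for a slope angle given in degrees.
    static double gradient(double degrees) {
        return Math.tan(Math.toRadians(degrees));
    }

    public static void main(String[] args) {
        System.out.println(gradient(5));      // ≈ 0.0875, i.e. 8.7%
        System.out.println(gradient(12));     // ≈ 0.2126, i.e. 21.3%
        System.out.println(10 * 60 / 300.0);  // ±10 min per 300 m ⇔ ±2 s per meter
    }
}
```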
For calculating the ascent or descent I need the distance d defined in line 254 in RouteResultPreparation.java. The only thing that needs to be implemented is two more if clauses after line 270, which is if (heightDiff > 0) { //ascent only, and which should be (as far as I understand the code):

    } else if (heightDiff < 0) { // descent only
        float descent = -heightDiff / d; // descent gradient as a positive fraction
        if (descent >= 0.087f && descent <= 0.213f) {
            // Langmuir's rule: subtract 10 minutes for every 300 meters of descent
            // for slopes between 5 and 12 degrees (heightDiff < 0, so adding it lowers the value)
            distOnRoadToPass += heightDiff * 2.0f;
        } else if (descent > 0.213f) {
            // Langmuir's rule: add 10 minutes for every 300 meters of descent
            // for slopes greater than 12 degrees
            distOnRoadToPass -= heightDiff * 2.0f;
        }
    }
@sonora: I didn’t understand or find what you said about routing xml. @Zahnstocher: Do you think that is ok?
@xmd5a2 'Horribly' means what? Statistics, numbers?
Any idea how maps.me does it, because it really seems very quick!
@erik55 I don't have statistics, but when I tested it, calculation in mountain areas was 2-5 times slower than in plain areas.
@erik55 I also implemented it very similarly, but I added a check so that "time - correction time" is not < 0:

    if ((d / speed - correction) > 0.0)
But I'm not satisfied with the results of Langmuir, because it is not a continuous function. Langmuir makes a huge step at -12 degrees, this seems not very logical and leads to strange results: https://en.wikipedia.org/wiki/File:Hiking_speed.svg
@erik55
Any idea how maps.me does it, because it really seems very quick!
maps.me uses OSRM backend, which uses extremely fast contraction hierarchies. But this needs complete pre-calculations, thus it is not possible to change profiles on-the-fly, everything is hard-coded. But there are other routing engines:
@sonora
PS: Are you sure it has been tested enough to default it in the bicycle profile, too? I do get some weird asymmetries between the outbound and return trip when the ele checkbox feature is activated, but I have not tested bicycle much.
Why do you think asymmetries are weird? The return trip has a reverse elevation profile, which will lead to a different return route in many cases, because according to the routing.xml the height obstacle is not symmetric for ascent and descent. I tested it a lot; the results with elevation data are much better, but sometimes it is horribly slow in bicycle mode.
@xmd5a2 Yes, elevation calculation is very slow in bicycle mode, thus it may be better to not enable it by default in bicycle mode.
By 'weird asymmetry' I mean that I have found bicycle routes where the outbound trip (uphill) was calculated to take 7 minutes, on a rather direct track, while the return trip (downhill) was calculated along a much longer tertiary road taking 16 minutes. Deactivating the elevation tick box resulted in the shorter route and travel time in both directions. Can post examples in the next few days if needed.
@Zahnstocher Well, you are right about the non-continuous function. But so far I thought this would be ok, because I assumed Langmuir – as a somewhat famous mountain walking guide/leader – couldn’t be wrong. But when I tested it with the numbers I posted on 2018-08-20, I got very bad results!
For the 450 m height difference within the 2500 m walk the slope angle is 10.20 °. Therefore Langmuir would suggest an increase of speed (which I find rather unrealistic remembering the situation).
Calculating the time needed with Langmuir results in
(1) 2.5/5×60 − (450×2/60) = 15 min (it was 60 min in reality!).
But then using Tobler’s function I get a resulting speed of
(2) 6×exp( − 3.5×(450/2 500 + 0.05)) = 2.68 km/h.
This leads to an estimated time of arrival (ETA) of
(3) 2.5/(6×exp( − 3.5×(450/2 500 + 0.05)))×60 = 55.91 min
which is astonishingly accurate (the real time for the fastest walker of the group was 60 min, the middle was 70 min; but the time was stretched by some photos the group members had to take of the landscape). Therefore I’d say: perfect! And compared to Naismith’s rule (on the graph you posted), it also seems legit for ascending slopes. I think, as it is a function, it could be implemented without if clauses, with a possible speedup of the calculation. We should try it.
PS: I seem to do something wrong, as the table with sample values in the wikipedia article shows higher walking speeds, for example for a descending slope of 10 ° it shows a resulting speed of 3.86 km/h and therefore an ETA of
(4) 2.5/3.86×60 = 38.86 min
for the 2500 m walk descending 450 m (but maybe I’m just tired and should rethink it tomorrow).
@Zahnstocher and @sonora If there are potent routing engines, why do we need to reinvent the wheel? Why not just use any of them?
And what do you mean by precalculated? It is possible to change the profile in maps.me. I don’t understand what you mean with “not possible to change profiles on-the-fly”.
A big problem with Langmuir in my experience is that true downhill speeds, and even whether there is a speed gain or, on the contrary, a slowdown, depend a lot on the surface: rolling stones, sand, and rough trail all make a big difference, while Langmuir only considers slope. That's why I was reluctant to implement it in the first place. Also: the slope is rarely homogeneous, hence you would need to take a much more segmented approach for better precision.
Routing engines: Long story. The more flexibility you need, the more you have to calculate in situ rather than use pre-calculated data.
@erik55
(2) 6×exp( − 3.5×(450/2 500 + 0.05)) = 2.68 km/h.
You calculated ascent. Descent is: 6×exp(−3.5×abs(−450/2500 + 0.05)) = 3.8067 km/h.
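Tobler's hiking function with the sign handled correctly can be sketched as below; the absolute value is what distinguishes ascent from descent, and dropping the minus sign on the slope reproduces the 2.68 km/h ascent figure instead of the faster descent speed (constants from the standard formula, not from any OsmAnd code):

```java
public class ToblerFunction {

    // Tobler's hiking function: walking speed in km/h for slope dh/dx
    // (negative slope = descent). Maximum speed occurs on a slight descent.
    static double speedKmh(double slope) {
        return 6.0 * Math.exp(-3.5 * Math.abs(slope + 0.05));
    }

    public static void main(String[] args) {
        double up = speedKmh(450.0 / 2500.0);    // ascent:  ≈ 2.68 km/h
        double down = speedKmh(-450.0 / 2500.0); // descent: ≈ 3.81 km/h
        double etaMin = 2.5 / down * 60;         // ≈ 39 min for the 2.5 km descent
        System.out.println(up + " " + down + " " + etaMin);
    }
}
```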
And what do you mean by precalculated? It is possible to change the profile in maps.me. I don’t understand what you mean with “not possible to change profiles on-the-fly”.
AFAIK (in maps.me) you can only switch between the pre-calculated profiles (car, pedestrian, bicycle, ...), but it is not possible to change the profile itself (for example, avoid stairs, avoid motorways, avoid ferries, ..., use height data or not, ...).
If there are potent routing engines, why do we need to reinvent the wheel? Why not just use any of them?
I don't know. But one reason probably is that none of them supports live updates.
Just for the record: simply setting the default for using the elevation data in the routing.xml, as we do it now (false for bicycle, true for the pedestrian profile) does not really initiate this as if it were a profile-dependent setting, as it is merely a routing attribute for now.
From my tests, if you want it preset reliably, you need to initialize manually once per profile by deliberately checking or unchecking the checkbox (as desired, in both navigation profiles).
As we have it now, if the checkbox has never been touched after a fresh install, it gets initialized once per app start, either for the bicycle or the ped default, whichever routing profile is used first by the user (and it will remain the same for the other profile then). If this is a big issue, it needs some refactoring, which probably deserves a separate issue. Not sure if any other routing attributes would need such an improvement, too.
Question: in the new version of OsmAnd, height seems to be taken into account also when walking downwards (at least there is a difference in time when switching between "use elevation data" and not). Is that true?
If so: congratulations!
Yes, we introduced it for pedestrian routing as well.
If the distance between two mountain peaks is 1 km as a straight line, and the distance from the peak to the foot of each mountain is approx. 1 km each, OsmAnd shows the walking distance of 1 km, whereas Garmin shows 2 km.