Closed: jlsogorb closed this issue 3 years ago
Just a side note: I've had an issue recently with drawing a large number of GPX tracks. Nothing to do with this library, just the sheer volume of points with Leaflet polylines. I ended up simplifying the GPX files (reducing the number of points, since mine were just exports from Strava with a point for every second), and the map loads a lot faster. So if each GPX is quite large like mine, it could be worth exploring simplifying the files.
Thank you, Ibrierley. That's a good solution. I have reduced the tracks to 500 points each (the average was ~4000). The file sizes dropped from around 600 kB to 70 kB and the render time from 20 s to 6 s. The only problem I see is the download link (I want to offer the most accurate track), so I've had to keep two copies of each file: a simplified one to render and the original to download. By the way, I used a batch command with the GPSBabel software to do the conversion easily (there were more than 300 files).
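For anyone who would rather thin tracks in a Node script or in the browser instead of with GPSBabel, a very crude sketch is to keep every Nth point up to a target count (a real simplifier such as Douglas-Peucker preserves shape better; this is just the naive version the numbers above suggest):

```javascript
// Downsample a track to roughly maxPoints by keeping every Nth point.
// The first and last points are always kept so the endpoints survive.
function downsampleTrack(points, maxPoints) {
  if (points.length <= maxPoints) return points.slice();
  var step = Math.ceil(points.length / maxPoints);
  var out = [];
  for (var i = 0; i < points.length; i += step) out.push(points[i]);
  // Make sure the final point is included.
  if (out[out.length - 1] !== points[points.length - 1]) {
    out.push(points[points.length - 1]);
  }
  return out;
}
```

With ~4000 points per track and a target of 500, this drops roughly 7 of every 8 points, which matches the size reduction reported above.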
Console debugging results: 360 requests, 38.7 kB transferred, 26.1 MB resources, Finish: 5.47 s, DOMContentLoaded: 472 ms, Load: 733 ms.
I'm not sure if I saw this issue's title in passing recently or what, but I actually ended up thinking about this very problem a couple of nights ago!
It's true that the current implementation of leaflet-gpx is going to be very memory-intensive when you have many (or very complex) GPX tracks loaded, because I hold references to the original XML, plus several arrays of all the points for their positions, elevations, etc. This is clearly a tradeoff between convenient data-access methods and memory usage. If you're experimenting, could you try clearing out the `_info.elevation._points`, `_info.hr._points`, `_info.atemp._points` and `_info.cad._points` arrays, and maybe clearing the reference to `_gpx` as well after parsing (line 99)? Curious to see how much that would help and how much those consume.
Thanks!
I have commented out this whole section of the code. Is that what you meant? The results: 31 / 342 requests, 555 kB / 83.2 MB transferred, 554 kB / 83.1 MB resources, Finish: 20.05 s, DOMContentLoaded: 353 ms.
```js
/*
_ = el[i].getElementsByTagNameNS('*', 'hr');
if (_.length > 0) {
  ll.meta.hr = parseInt(_[0].textContent);
  this._info.hr._points.push([this._info.length, ll.meta.hr]);
  this._info.hr._total += ll.meta.hr;
}

_ = el[i].getElementsByTagNameNS('*', 'cad');
if (_.length > 0) {
  ll.meta.cad = parseInt(_[0].textContent);
  this._info.cad._points.push([this._info.length, ll.meta.cad]);
  this._info.cad._total += ll.meta.cad;
}

_ = el[i].getElementsByTagNameNS('*', 'atemp');
if (_.length > 0) {
  ll.meta.atemp = parseInt(_[0].textContent);
  this._info.atemp._points.push([this._info.length, ll.meta.atemp]);
  this._info.atemp._total += ll.meta.atemp;
}

if (ll.meta.ele > this._info.elevation.max) {
  this._info.elevation.max = ll.meta.ele;
}
if (ll.meta.ele < this._info.elevation.min) {
  this._info.elevation.min = ll.meta.ele;
}
this._info.elevation._points.push([this._info.length, ll.meta.ele]);
this._info.duration.end = ll.meta.time;

if (last != null) {
  this._info.length += this._dist3d(last, ll);

  var t = ll.meta.ele - last.meta.ele;
  if (t > 0) {
    this._info.elevation.gain += t;
  } else {
    this._info.elevation.loss += Math.abs(t);
  }

  t = Math.abs(ll.meta.time - last.meta.time);
  this._info.duration.total += t;
  if (t < options.max_point_interval) {
    this._info.duration.moving += t;
  }
} else if (this._info.duration.start == null) {
  this._info.duration.start = ll.meta.time;
}
*/
```
Wouldn't it be cleaner to remove it from the source GPX files in the first place, so there is less to download and convert? (Maybe I'm misunderstanding a bit.)
It depends on what the issue is: is it the time it takes to download and parse, or the resident memory usage once everything is loaded?
Well, I suppose the problem is the huge number of points to draw. Looking into my GPX files, these are the numbers: 303 files, 619,013 polyline points to draw. It's really big. Reducing the tracks to 500 points each, as suggested by Ibrierley, brings it down to 149,699 points (maybe I could reduce them even further, say to 200, without a noticeable negative visual effect). On the other hand, the resident memory usage is really high as well; I'm not sure how to measure it correctly (Task Manager shows a memory consumption of over 200 MB). Another question: would it be possible to load all the tracks in a hidden layer (showing a progress bar) until all of them can be shown at once?
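One way to do the "hide until everything is ready" idea: create every layer without adding it to the map, wait for all of their load events, then add them in one go. A sketch of the promise glue, independent of Leaflet; `makeLayer` is a hypothetical factory that should return something firing `loaded`/`error` events, the way `new L.GPX(url, {async: true})` does:

```javascript
// Load every track in the background, resolving once all are ready.
// Update a progress bar inside the 'loaded' handler (done / urls.length).
function loadAll(urls, makeLayer) {
  var done = 0;
  return Promise.all(urls.map(function (url) {
    return new Promise(function (resolve, reject) {
      var layer = makeLayer(url);
      layer.on('loaded', function () {
        done++; // e.g. progressBar.value = done / urls.length;
        resolve(layer);
      });
      layer.on('error', reject);
    });
  }));
}

// Browser usage sketch (assumes Leaflet + leaflet-gpx are loaded):
// loadAll(listado, function (u) { return new L.GPX('gpx/' + u, { async: true }); })
//   .then(function (layers) { layers.forEach(function (l) { l.addTo(map); }); });
```

This doesn't make parsing faster, but the map appears in one paint instead of track by track.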
For what it's worth, I've reduced my GPX tracks by a factor of 20-30, i.e. a track that started with 3000 points ended up with 100, with no discernible visual/waypoint loss. All of my original GPX tracks were just recorded from Strava, so they were very bloated. It's pretty impressive how much you can cull without a problem; the difference in my case was immense.
The other question is whether all 300 are actually visible in the viewed area, or whether many are in different countries from what's being viewed and could be culled by boundaries or something? There are often a few ways to hack around the problem.
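The culling idea is cheap if each track's bounding box is known up front (it can be precomputed offline), so only tracks whose box intersects the current view get instantiated. A sketch of the intersection test with plain `[south, west, north, east]` boxes, a hypothetical format; in a real Leaflet app `L.LatLngBounds.intersects` does the same job:

```javascript
// Axis-aligned bounding-box test: do two [south, west, north, east]
// boxes (degrees) overlap?
function boxesIntersect(a, b) {
  return a[0] <= b[2] && b[0] <= a[2] &&  // latitude ranges overlap
         a[1] <= b[3] && b[1] <= a[3];    // longitude ranges overlap
}

// Keep only the tracks whose precomputed bounds touch the view box.
function visibleTracks(tracks, viewBox) {
  return tracks.filter(function (t) {
    return boxesIntersect(t.bounds, viewBox);
  });
}
```

Re-running this on the map's `moveend` event and adding/removing layers accordingly would keep the point count proportional to what is actually on screen.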
Yes, I could reduce the points even more.
The tracks are all shown on the same map (it covers a single region). It's true that there are some zones where the density of the lines is too high (I have tried to implement some kind of clustering, but without success).
The initial image is this:
I've got a lot fewer tracks, but this may give you an idea of how mine performs with heavily simplified points: https://mappy.dabbles.info/ (I think there are about 50 routes).
Yes, your site renders very, very quickly (fewer tracks, fewer points...). By the way, have you thought about showing more information about the tracks on 'mouseover' and 'click' events? Take a look at mine: http://www.luisso.net/rutas-de-trail
Closing, as this isn't really a leaflet-gpx issue. If there are specific things that the L.GPX layer keeps references to that are too "heavy" and that we should trim down, then feel free to re-open.
Hi. Your code is really useful and interesting, thank you for sharing. This is not really an issue, but a question. I am loading many tracks on a single map (over 300 GPX files, and the number is going to grow in the future). As they load one by one it takes more than 20 seconds to load the page, and if I open the debugging console I get a lot of warnings like "[Violation] 'readystatechange' handler took 125ms", and so on. As a summary of the loading I get: 347 requests, 79.1 kB transferred, 83.2 MB resources, Finish: 23.71 s, DOMContentLoaded: 219 ms, Load: 1.29 s. My question is: is there a way to make this more efficient, or could I hide the tracks until all of them are loaded? This is my code (I have a file called "ListaGpx.txt" containing a list of all the tracks like this):
```js
var listado = ["xxxvii-pujada-a-la-font-roja.gpx", "xorretcati-rabosa-rodeo-despenyador.gpx", ...
```
The code above is simplified; if you want to see the working site with all the features I have included: http://www.luisso.net/rutas-de-trail
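With 300+ files, firing one request per file sequentially (or all at once) is part of the slowdown. A small concurrency limiter keeps a handful of downloads in flight at a time. A generic sketch, independent of Leaflet; the worker function is a stand-in for whatever fetches and builds each track layer:

```javascript
// Run an async worker over items with at most `limit` tasks in flight.
// Results come back in the original order of `items`.
function runLimited(items, limit, worker) {
  var i = 0;
  var results = new Array(items.length);
  function next() {
    if (i >= items.length) return Promise.resolve();
    var idx = i++;
    return Promise.resolve(worker(items[idx])).then(function (r) {
      results[idx] = r;
      return next(); // this lane picks up the next pending item
    });
  }
  var lanes = [];
  for (var k = 0; k < Math.min(limit, items.length); k++) lanes.push(next());
  return Promise.all(lanes).then(function () { return results; });
}

// Browser usage sketch: runLimited(listado, 8, loadOneTrack)
//   where loadOneTrack returns a promise for a ready layer.
```

A limit around 6-8 roughly matches what browsers allow per host anyway, so the queue stops requests from piling up in the event loop all at once.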