NetTopologySuite / NetTopologySuite.IO.GPX

GPX I/O for NetTopologySuite

Yet another invalid GPX file #34

Open · HarelM opened this issue 5 years ago

HarelM commented 5 years ago

I've encountered another invalid GPX file, described in the following issue: https://github.com/IsraelHikingMap/Site/issues/1043. I'm not sure how to properly solve this one, though. I've opened a ticket with Samsung Health, but I don't have high hopes that they will fix it...

zoharw commented 5 years ago

Thank you. I thought it might be a bug in the site, so I checked in Strava and Garmin, where the file opened, and reported it to you accordingly. I'm not well versed in these files, and it surprised me that there is no way to upload the original file; you have to change its extension instead. In any case, thanks again for handling this. As far as I'm concerned, the complaint can be closed, in the hope that Samsung will update their file "creation". Zohar

airbreather commented 5 years ago

Looks like we're not the only ones to see this: https://www.geocaching.hu/forum.geo?action=thread&id=kutyuk&message_id=475943

A benefit of (initially) staying close to the standard is that it keeps options open for how to handle non-standard situations, because in almost every case (including this one), it means that we only need to affect situations that throw errors.

So to help this situation out, I don't see a problem with adding these special-cases for when we're reading the direct children of the <gpx> element:

  1. <metadate>x</metadate> --> <metadata><time>x</time></metadata>
  2. <exerciseinfo>x</exerciseinfo> gets wrapped in <extensions></extensions>

I think it's fine for this special-case to be opt-out instead of opt-in, because we can still faithfully read in all the data from the file into our model.
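
As a rough sketch of what those two rewrites could look like (this is not the package's implementation, just an illustrative LINQ-to-XML pre-processing pass; GetDefaultNamespace() covers both namespaced and namespace-free files):

using System.Linq;
using System.Xml.Linq;

// Rewrite the two known non-standard elements before the real GPX reader runs.
// Usage (illustrative): NormalizeNonStandardGpx(XDocument.Load(path))
static XDocument NormalizeNonStandardGpx(XDocument doc)
{
    XNamespace ns = doc.Root.GetDefaultNamespace();

    // 1. <metadate>x</metadate> --> <metadata><time>x</time></metadata>
    foreach (var metadate in doc.Root.Elements(ns + "metadate").ToList())
    {
        metadate.ReplaceWith(
            new XElement(ns + "metadata",
                new XElement(ns + "time", metadate.Value)));
    }

    // 2. <exerciseinfo>...</exerciseinfo> gets wrapped in <extensions>
    foreach (var exerciseInfo in doc.Root.Elements(ns + "exerciseinfo").ToList())
    {
        var extensions = new XElement(ns + "extensions");
        exerciseInfo.ReplaceWith(extensions);
        extensions.Add(exerciseInfo); // re-attach the detached element inside the wrapper
    }

    return doc;
}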

airbreather commented 4 years ago

I originally identified this for the 1.0.0 milestone, but the rest of NTS updates took all my time. I'll get it for 1.1.

patcor commented 4 years ago

> I originally identified this for the 1.0.0 milestone, but the rest of NTS updates took all my time. I'll get it for 1.1.

Did this get pushed out of the 1.1 release as well? We are using the 1.1 release, and a file like this (see below) completely hangs the process (100% CPU on one core) when we read it. When we debug, I can see that the code reaches VisitMetaData(), but after that I don't get any more information.

The code below works around the issue, but it would be nice if the fix were implemented in the package. It would also be nice to understand why the process hangs; throwing an exception that we could handle would be better.

Replace("<metadate>", "<metadata><time>").Replace("</metadate>", "</time></metadata>");
Replace("<exerciseinfo>", "<extensions><exerciseinfo>").Replace("</exerciseinfo>", "</exerciseinfo></extensions>");

GPX file to reproduce issue (the XML markup was stripped in transcription; only the declaration and the element text survive):

<?xml version='1.0' encoding='UTF-8' ?> ... 2020-05-08T12:50:45Z 20200508_164759.gpx 52.492123 1002 10005.971 0.0 0.14071980320389685 3557010 2.8130286

HarelM commented 3 years ago

Looking at this last comment, I think this is the same as #41. Too bad this wasn't handled when it was reported... :-(

airbreather commented 3 years ago

> because in almost every case (including this one), it means that we only need to affect situations that throw errors.

Apparently I had completely missed the comments saying that this was causing the process to run into an infinite loop. Otherwise, I would have realized that this would allow malicious clients to trivially craft inputs that consume massive amounts of server resources.

I apologize for my slow response to this one.

I'll leave this open to consider capturing those transformations that I'd mentioned earlier:

  • <metadate>x</metadate> --> <metadata><time>x</time></metadata>
  • <exerciseinfo>x</exerciseinfo> gets wrapped in <extensions></extensions>

HarelM commented 3 years ago

Don't get me wrong, I somehow missed this message too; I'm not sure how. I don't think the hang happened in previous versions, otherwise I would have mentioned it in the original post; I think it was introduced in a recent version. If I had remembered reading this post, I wouldn't have spent a few days digging into dump files and trying to figure out why my server resources were so busy :-) In any case, I'm happy this is now behind us and that the hanging issue is solved. Thanks again for all your hard work and detailed responses!