SamR1 / docker-fittrackee

GNU General Public License v3.0

0.4.5-docker: GPX file max size ignored #6

Closed sgofferj closed 3 years ago

sgofferj commented 3 years ago

I set the max. size for a GPX file to 100MB in the admin settings. However, FT rejects a 1.5MB GPX file as too large.

SamR1 commented 3 years ago

Hi,

I did not manage to reproduce this issue locally (with and without docker), with the max. size of uploaded files set to 100MB and a 1.7MB file.

If you are using nginx, for instance, note that the default value for client_max_body_size is 1MB. Uploading a file larger than 1MB leads to the same error (screenshot attached). I will add a warning to the documentation.
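For reference, raising that limit in nginx looks like this (a minimal sketch; the value should match, or exceed, the limit configured in FitTrackee's admin settings):

```nginx
# inside the server (or location) block that proxies FitTrackee;
# the default is 1m, which rejects larger uploads with a 413
client_max_body_size 100m;
```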

sgofferj commented 3 years ago

Hmm, yes, I do see a 413 but I'm using Traefik without buffering. Traefik logs don't show any error... Edit: And I can upload huge ZIP files without 413 - strange. I'll investigate further.

sgofferj commented 3 years ago

OK, here comes the fun part: I wipe everything, start a completely fresh container, make init, log in... Uploading bigger GPX files works fine. Then I upload one ZIP file which includes all 67 GPX files from January. The result is an error (screenshots attached).

After that, trying to upload large GPX files produces a 413... I can put the ZIP up for download if you need it.

SamR1 commented 3 years ago

What is the API response? (displayed in the Network tab)

sgofferj commented 3 years ago

{"status": "fail", "message": "Error during picture update, file size exceeds 1.0MB."}

However... (screenshot attached)

sgofferj commented 3 years ago

That's the response to the large GPX file after the large ZIP file error. The response to the large ZIP file is "502 Bad gateway".

SamR1 commented 3 years ago

{"status": "fail", "message": "Error during picture update, file size exceeds 1.0MB."}

The message is incorrect, it should be "Error on file upload, file size exceeds 1.0MB.". I'll fix this (#72).

About the "502 Bad gateway": I think it is caused by a gunicorn worker timeout (processing a large file takes too long). That's why, for large files and archives, import should be asynchronous (#19). Limiting file size avoids that kind of error with synchronous import.

What's the average size of your GPX files? (In my case, the GPX files I upload are no larger than 1MB, for workouts of 2-3 hours max and without heart rate data.)

sgofferj commented 3 years ago

I think there is another problem that needs looking at. I can upload larger GPX files without problems if I increase the file size limit in the app admin settings. It's just that after the upload of a ZIP file fails, I cannot upload larger GPX files any more, so the failed upload must break something. I noticed that the extracted files remain in the extract directory after a failure. Could that be the problem?
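(For illustration, a defensive way to guarantee cleanup of an extract directory even when processing fails - this is a sketch with hypothetical names, not FitTrackee's actual code:)

```python
import os
import shutil
import tempfile
import zipfile

def extract_gpx_archive(zip_path):
    """Extract a workout archive into a temporary directory and return
    the names of the .gpx files it contains. The finally block removes
    the extract directory even if extraction or processing raises."""
    extract_dir = tempfile.mkdtemp(prefix="fittrackee_extract_")
    try:
        with zipfile.ZipFile(zip_path) as archive:
            archive.extractall(extract_dir)
        return sorted(
            name for name in os.listdir(extract_dir)
            if name.endswith(".gpx")
        )
    finally:
        # clean up whether or not an error occurred above
        shutil.rmtree(extract_dir, ignore_errors=True)
```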

Most of my GPX files are below 1MB. However, I log one point per second, I usually wear a heart-rate sensor, and I log pretty much every walk - and I do almost everything on foot. So larger files occur frequently. In summer, we hike over 10km regularly, so there are more large files then.

SamR1 commented 3 years ago

I think there is another problem that needs looking at. I can upload larger GPX files without problems if I increase the file size limit in the app admin settings. It's just that after the upload of a ZIP file fails, I cannot upload larger GPX files any more, so the failed upload must break something. I noticed that the extracted files remain in the extract directory after a failure. Could that be the problem?

I tried to import a very large GPX file on my prod instance (after updating the FitTrackee file config, and the nginx and gunicorn configs, to allow files up to 100MB). I got a 502 error: parsing the GPX file fails silently (no error in FitTrackee logs). It's not the upload to the server that fails, but the processing of the GPX content (in this case, gpxpy cannot handle this file and raises no exception; in other cases, gunicorn workers can time out due to long processing duration).
But I could upload a zip archive (4.5MB with 23 files, 1.6MB per file) right away without errors.

Indeed, processing GPX files needs some improvements to handle large files. That part hasn't really changed since the beginning. I was waiting for a fix for the gpxpy issue on naive datetimes before working on this again (so far I have had no issues uploading my workouts, but I only upload files one by one and they don't exceed 1MB).
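A cheap way to gauge how heavy a GPX file will be before full processing is to count its track points with the standard library (an illustrative sketch only; FitTrackee itself parses files with gpxpy):

```python
import xml.etree.ElementTree as ET

# GPX 1.1 documents use this XML namespace
GPX_NS = "{http://www.topografix.com/GPX/1/1}"

def count_track_points(gpx_xml):
    """Return the number of <trkpt> elements in a GPX 1.1 document.
    A one-point-per-second recording of a multi-hour hike can easily
    hold tens of thousands of points."""
    root = ET.fromstring(gpx_xml)
    return len(root.findall(".//%strkpt" % GPX_NS))
```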

Most of my GPX files are below 1MB. However, I log one point per second, I usually wear a heart-rate sensor, and I log pretty much every walk - and I do almost everything on foot. So larger files occur frequently. In summer, we hike over 10km regularly, so there are more large files then.

Thanks for the details. What's the average size of these 10km workouts? And the size and content of the archive that fails (number of files and min/max file size)? I will try to test with larger files.

sgofferj commented 3 years ago

Here's the exact ZIP file that failed: https://cloud.gofferje.net/s/jEkXZM4F6pMsWGr

I'll have to dig through my archive to find some longer hike files from last summer.

Edit: I also usually would upload one by one, but FT looks so promising that I decided to import my archives at least from this year. That's how I ran into the problems.

SamR1 commented 3 years ago

Thanks for the archive and your feedback. (You can delete the link if you want; I downloaded the archive.)

SamR1 commented 3 years ago

I got an error when uploading the archive with docker locally (no error without docker).
So I modified the Dockerfile to use gunicorn directly with a 300s timeout (the default is 30s), and I was able to upload the archive without errors (but I still think file upload needs some improvements).
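The change can be sketched like this (the bind address and application module are assumptions for illustration, not the exact Dockerfile contents):

```dockerfile
# run gunicorn directly with a 300 s worker timeout
# (the 30 s default is too short for processing large uploads)
CMD gunicorn --bind 0.0.0.0:5000 --timeout 300 "fittrackee:create_app()"
```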

Can you try the updated version (see branch update-dockerfile)?

sgofferj commented 3 years ago

I'll try later today.

sgofferj commented 3 years ago

Sorry, life got in the way. Testing now.

sgofferj commented 3 years ago

I don't see this error anymore with the update-dockerfile branch. Confirmed fixed.

sgofferj commented 3 years ago

Here's a GPX file from a 16km hike last summer: https://cloud.gofferje.net/s/NskaEALN549Qa6n 1.4MB and a lot of points.

SamR1 commented 3 years ago

I don't see this error anymore with the update-dockerfile branch. Confirmed fixed.

great! thanks for your feedback.