Firefishy opened this issue 1 year ago
We can batch convert the existing files too if needed, but best to coordinate this.
This is complicated because quite a lot of them are already compressed, so we first need to identify which ones are not.
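One way to tell compressed files from uncompressed ones is the gzip magic header (bytes 0x1f 0x8b, per the gzip format spec). A minimal Ruby sketch — the helper name is hypothetical, and it assumes the stored trace has already been downloaded to a local path:

```ruby
# Returns true if the file at `path` starts with the gzip magic bytes.
# This is a heuristic: it checks the header only, not the whole stream.
def gzipped?(path)
  magic = File.binread(path, 2)
  magic == "\x1f\x8b".b
end
```

A batch job could walk the stored traces, skip anything already gzipped, and compress the rest.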
Perhaps cleaning up the "sunnypilot" and "dragonpilot" GPS traces would be enough to achieve the anticipated savings, since they account for ~56% of all GPS trace points and all of their traces are uncompressed.
Cleaning up could also mean simply deleting a fair number of over-noded traces, which would avoid the "transfer out" traffic altogether. Other options are being discussed in the operations issue linked above.
Compressing new GPX files before sending them to S3 would have to be done in this repo somehow.
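A minimal sketch of what that could look like, assuming the trace XML is available as a string before it is attached. The `Trace#file` attachment name in the usage comment is hypothetical; the real model and attachment name would need checking against this repo:

```ruby
require "zlib"
require "stringio"

# Gzip GPX data in memory before handing it to Active Storage,
# so the object lands compressed in S3.
def gzip_gpx(data)
  buf = StringIO.new("".b)
  gz = Zlib::GzipWriter.new(buf)
  gz.write(data)
  gz.close
  buf.string
end

# Usage sketch (attachment name is an assumption, not run here):
# trace.file.attach(io: StringIO.new(gzip_gpx(raw_xml)),
#                   filename: "#{trace.id}.gpx.gz",
#                   content_type: "application/gzip")
```

Readers of the file would then need to gunzip on download, so the change has to be coordinated with every code path that opens a stored trace.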
As noted elsewhere, we probably don't need to store the original gpx files in the first place. All relevant details should be in the gps_points table already. Or am I missing something here?
Problem
The source GPX traces are currently saved uncompressed in AWS S3 via Active Storage.
We currently spend ~$124 per month on storage and transfer-out (transfer-in is free) for GPX traces.
Description
We should compress GPX files when saving them in S3.
We would save 60%+ in fees if we gzipped the files.
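The 60%+ figure is plausible because GPX is verbose, repetitive XML. A quick sanity check on a synthetic GPX-like payload (this repeated sample compresses far better than real traces would, so treat the number as illustrative only):

```ruby
require "zlib"

# Synthetic trackpoint data: repetitive XML, like a real GPX trace but more so.
sample = %(<trkpt lat="51.5007" lon="-0.1246"><ele>12.3</ele></trkpt>\n) * 5000
compressed = Zlib.gzip(sample)
ratio = 1.0 - compressed.bytesize.to_f / sample.bytesize
puts format("saved %.0f%% of %d bytes", ratio * 100, sample.bytesize)
```

Running it against a sample of real stored traces would give a defensible savings estimate before committing to the migration.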