LordMike opened this issue 7 years ago
Thanks for the idea. We will look at that.
We have compressed the planet file with the default gzip settings and it came out at 34 GB (down from 47 GB).
Inside an .mbtiles file the .pbf tiles are already gzipped, so we did not expect such a large saving from additional compression - but based on your numbers it would indeed make sense to turn it on.
The problems are:
Static gzip with nginx: https://www.nginx.com/resources/admin-guide/compression-and-decompression/#send
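On the nginx side, the module behind that link can serve a precompressed `.gz` file sitting next to the original whenever the client accepts gzip. A minimal sketch (the location path is a placeholder; requires nginx built with `ngx_http_gzip_static_module`):

```nginx
location /tiles/ {
    # If chile.mbtiles.gz exists next to chile.mbtiles and the client
    # sends "Accept-Encoding: gzip", nginx serves the .gz file directly
    # with "Content-Encoding: gzip" - no on-the-fly compression.
    gzip_static on;
}
```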
Static gzip with S3: it is possible to set the gzip headers permanently on the cloud storage (our S3 equivalent) by uploading a gzipped version of the file and setting the Content-Encoding: gzip header.
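A sketch of that upload, assuming the AWS CLI and placeholder file/bucket names (the upload command itself is left commented since it needs credentials and a real bucket):

```shell
# Stand-in for a real tileset, so the compression step can be exercised:
printf 'fake tile data' > chile.mbtiles
gzip -9 -k chile.mbtiles        # produces chile.mbtiles.gz, keeps the original
gunzip -t chile.mbtiles.gz      # sanity-check the archive

# Upload the compressed copy under the original name and mark it as
# gzip-encoded so browsers decode it transparently (bucket is a placeholder):
# aws s3 cp chile.mbtiles.gz s3://tiles-bucket/chile.mbtiles \
#     --content-encoding gzip \
#     --content-type application/octet-stream
```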
Clients not able to gunzip would fail, but there are not many such clients anymore. In desktop web browsers this would work fine.
We have to verify how to correctly download gzipped files on the command line - ideally with a resume option.
wget -c
would probably not work. It would only work if people download the gzipped version and decompress it after the download finishes (and they need extra disk space for this on their side).
Curl has: curl --compressed "http://example.com"
as in http://stackoverflow.com/questions/8364640/how-to-properly-handle-a-gzipped-page-when-using-curl - but we are not sure whether this is compatible with resume (-C -) mode.
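The safest command-line path may be the one mentioned above: fetch the .gz file explicitly with resume support, then decompress locally. A sketch with a placeholder URL (the network step is commented out; a local stand-in exercises the same decompress step):

```shell
# Resumable download of an explicitly gzipped file (URL is a placeholder);
# "-C -" tells curl to continue from where a previous partial download stopped:
# curl -C - -o planet.mbtiles.gz "https://example.com/planet.mbtiles.gz"

# Local stand-in so the decompress step can be run without a network:
printf 'tile bytes' > planet.mbtiles
gzip -9 planet.mbtiles          # creates planet.mbtiles.gz, removes the original

# Decompression after the download finishes - this is where the extra
# disk space is needed on the client side:
gunzip planet.mbtiles.gz        # restores planet.mbtiles
```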
The behavior on the command line must be verified first.
Hey,
I was downloading .mbtiles files for a map, and noticed that the files were uncompressed. I checked with gzip, and was able to get a compression ratio of 0.56 for e.g. Chile and 0.68 for Canada. Given that the files are static and rarely updated, wouldn't it be prudent to have them compressed?
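That ratio check can be reproduced with plain gzip; a sketch using a synthetic stand-in file (real tilesets will compress less than this artificial data, and the file name is an example):

```shell
# Stand-in for a real tileset - 10000 spaces, which is highly compressible:
printf '%10000s' ' ' > canada.mbtiles
gzip -9 -k canada.mbtiles       # keep the original so both sizes are measurable

# Ratio = compressed size / original size:
orig=$(wc -c < canada.mbtiles)
comp=$(wc -c < canada.mbtiles.gz)
awk -v o="$orig" -v c="$comp" 'BEGIN { printf "gzip ratio: %.2f\n", c/o }'
```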
By compressing them statically, and then setting the
Content-Encoding
header, the end result for clients should be identical to what it is now (as browsers will decode them before saving). That, or just serve up the .gz
files so clients will know they're gzipped.