After https://github.com/nst-guide/create-database/commit/216c11f422cc4ed566e456a5cadcfa9f6338d8fd I think I'm pretty much done with the cutting code.
I made high-res (512px) tiles from all the FSTopo maps (a hedged guess at the tiling command follows the listing below). Granted, I kept all files within 20 miles of the trail, which is obviously way too much, but it's still a lot of files:
```
> du -csh * | sort -h
4.0K    tilemapresource.xml
8.0K    leaflet.html
12K     openlayers.html
16K     googlemaps.html
44K     4
136K    5
488K    6
1.8M    7
7.1M    8
28M     9
106M    10
404M    11
1.5G    12
4.7G    13
11G     14
17G     15
35G     total
```
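The `tilemapresource.xml` and the leaflet/openlayers/googlemaps HTML previews in the listing are the helper files `gdal2tiles.py` writes next to its tiles, so the generation step was presumably something like the sketch below. The input filename is a placeholder, and the tile-size flag only exists in newer GDAL releases.

```bash
# Hypothetical gdal2tiles invocation matching the listing above (z4-z15).
# fstopo_mosaic.vrt is a placeholder; --tilesize is only available in
# newer GDAL releases, so an older setup would have needed custom
# cutting code to get 512px tiles.
gdal2tiles.py --zoom=4-15 --processes=4 --tilesize=512 fstopo_mosaic.vrt tiles/
```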
Let's say I only host up to zoom 14 and leave out 80% of the tiles. That's still 3.6GB of tiles for the trail, and split across five sections, that's over 700MB per section. I know high-res imagery is beautiful, but this is why vector tiles exist: you can have beautiful high-res rendering without making people download 700MB of data.
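Back-of-the-envelope, using the totals from the `du` listing:

```bash
# z4-z14 tiles = 35G total - 17G at z15 = 18G; keeping only ~20% of them:
echo '(35 - 17) * 0.2' | bc          # -> 3.6 (GB for the whole trail)
echo 'scale=0; 3.6 * 1024 / 5' | bc  # -> 737 (MB per section)
```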
Also, including zoom level 15, there are over 300,000 files here. I guess that's still only about $1.50 to PUT all those files, but I should only do it if it's actually going to be worth it.
```
> tree | wc -l
308119
```
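That $1.50 figure lines up with S3's standard PUT pricing, assuming the usual $0.005 per 1,000 requests:

```bash
# 308,119 PUT requests at an assumed $0.005 per 1,000 requests:
echo 'scale=2; 308119 * 0.005 / 1000' | bc  # -> 1.54 (dollars)
```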
Fixed in https://github.com/nst-guide/fstopo
I forget where I am with the US Forest Service topo maps. But in any case, I need to figure out the GDAL commands to cut them into seamless tiles. Do I need to create a metadata.json file for these raster tiles?
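A minimal sketch of one common GDAL recipe for seamless tiles from many adjacent quads, assuming the quads are georeferenced GeoTIFFs and the map collars have already been cropped off each one: build a virtual mosaic first so the tiler sees one continuous raster instead of per-quad edges, then tile that. Paths and zoom levels here are placeholders.

```bash
# Virtual mosaic: no pixels are copied; overlaps are resolved at read time.
gdalbuildvrt fstopo.vrt quads/*.tif
# Tile the mosaic; gdal2tiles writes tilemapresource.xml, not metadata.json.
gdal2tiles.py --zoom=4-14 --processes=4 fstopo.vrt tiles/
```

On metadata.json: as far as I know, that file belongs to the MBTiles/TileJSON ecosystem (e.g. mb-util writes one when exporting an MBTiles archive, and some tile servers expect TileJSON), so a plain z/x/y directory of raster tiles shouldn't need one unless the consuming client does.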