utagawal opened this issue 9 months ago
I am open to suggestions on how to make it more customizable and maybe add some inputs. For different sources I usually just make a copy of the script and tweak it; for example, I made one for GEBCO bathymetry which is basically the same with a few tweaks: https://github.com/acalcutt/GEBCO_to_MBTiles/blob/main/create_terrainrgb.sh
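As a rough idea of what "some inputs" could look like, here is a minimal sketch of parsing the hard-coded values as command-line options. The option names, defaults, and URL are illustrative, not from the actual script:

```shell
# Hypothetical option parsing for the tile-generation script.
# -u source URL, -s source projection, -z max zoom (all names assumed).
parse_opts() {
  SRC_URL="" SRC_SRS="EPSG:4326" MAX_ZOOM=11   # illustrative defaults
  while getopts "u:s:z:" opt; do
    case "$opt" in
      u) SRC_URL="$OPTARG" ;;   # source dataset URL
      s) SRC_SRS="$OPTARG" ;;   # source projection, e.g. EPSG:2154
      z) MAX_ZOOM="$OPTARG" ;;  # highest zoom level to render
    esac
  done
}

parse_opts -u https://example.com/dem.tif -s EPSG:2154 -z 13
echo "$SRC_URL $SRC_SRS $MAX_ZOOM"
```

The rest of the script would then use these variables instead of its current hard-coded paths and zoom levels.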
The real trouble for me is the space each layer takes. If you want higher resolution, each zoom level is basically 4x bigger than the one before it. With the JAXA dataset, zoom 11 looks pretty much as good as it will get, and that takes 138 GB. Just adding one more level, zoom 12, brings that up to 427 GB. I'd imagine going one more zoom level up to 13 will bring that file to at least 1 TB, possibly 2 TB.
I do like the idea of using higher resolution with fallback to JAXA, but I have no idea how you would accomplish that. It seems like you would need layers or masking in the GDAL VRT portion of the script, and I'm not sure that is an option. We had discussed something similar, filtering by height, at https://osmus.slack.com/archives/C01G3D28DAB/p1692976515173599 , and masking was a suggestion, which is similar to what this would need, I think.
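As far as I understand, `gdalbuildvrt` itself may already support this kind of fallback: it composes sources in list order, later files are drawn on top wherever they have valid data, and their nodata areas fall through to earlier files. A sketch of the idea (filenames and the nodata value are hypothetical):

```shell
# Fallback mosaic sketch: list the global JAXA mosaic first and the
# high-res DEM second, so the high-res data wins where it exists and
# JAXA shows through everywhere else (including the high-res nodata).
printf '%s\n' jaxa_world.vrt ign_5m_france.vrt > sources.txt

if command -v gdalbuildvrt >/dev/null; then
  # -srcnodata 0 is an assumption; use the real nodata value of the DEM
  gdalbuildvrt -resolution highest -srcnodata 0 fallback.vrt \
    -input_file_list sources.txt
fi
```

This avoids explicit masking entirely, as long as the high-resolution dataset has a proper nodata value outside its coverage.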
Agree with what you're saying, but see below the difference on our sample over France. This looks really great, and I'm really looking forward to being able to offer such nice rendering for the planet, even if storage needs are over a TB (as long as the script can do it, we can use cloud compute power to calculate it once and make it available (torrent?) for everyone, like planetiler does for the OSM data in mbtiles).
A couple of findings:
Zoom level | Resolution (meters/px) | Best dataset | Dataset native resolution (m/px) | Coverage | Dataset SRID | Alternate datasets (*) |
---|---|---|---|---|---|---|
0 | 55,346 | ETOPO1 | 1800 | Global + Bathymetry | 4326 | N/A |
1 | 27,673 | ETOPO1 | 1800 | Global + Bathymetry | 4326 | N/A |
2 | 13,837 | ETOPO1 | 1800 | Global + Bathymetry | 4326 | N/A |
3 | 6,918 | ETOPO1 | 1800 | Global + Bathymetry | 4326 | N/A |
4 | 3,459 | ETOPO1 | 1800 | Global + Bathymetry | 4326 | N/A |
5 | 1,730 | GEBCO_2019 | 464 | Global + Bathymetry | 4326 | N/A |
6 | 865 | GEBCO_2019 | 464 | Global + Bathymetry | 4326 | N/A |
7 | 432 | GEBCO_2019 | 464 | Global + Bathymetry | 4326 | N/A |
8 | 216 | NASADEM | 30 | Global | 4326 | N/A |
9 | 108 | NASADEM | 30 | Global | 4326 | N/A |
10 | 54 | NASADEM | 30 | Global | 4326 | N/A |
11 | 27 | IGN_5m | 5 | France | 2154 | SwissAlti2m, TINItaly, IGN_spain |
12 | 14 | IGN_5m | 5 | France | 2154 | SwissAlti2m, TINItaly, IGN_spain |
13 | 7 | IGN_5m | 5 | France | 2154 | SwissAlti2m, TINItaly, IGN_spain |
14 | 3.4 | IGN_1m | 1 | France | 2154 | SwissAlti2m, TINItaly, IGN_spain |
15 | 1.7 | IGN_1m | 1 | France | 2154 | SwissAlti50cm, TINItaly, IGN_spain |
16 | 0.8 | IGN_1m | 1 | France | 2154 | SwissAlti50cm, TINItaly, IGN_spain |
Maybe some ideas here: https://github.com/tilezen/joerd/blob/master/docs/data-sources.md
Now with the WebP compression, I guess it opens new opportunities size-wise?
Also, to continue the discussion:
Another approach would be to keep the terrain data at Z12 and build a hillshading mbtiles up to Z15?
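A hedged sketch of what that split could look like, assuming `gdaldem` for the hillshade pass (filenames are illustrative). The two `awk` lines only illustrate the ground resolution this would imply at z12 vs z15, using 512px web-mercator tiles at 45°N, consistent with the table above:

```shell
# Ground resolution roughly halves per zoom: z12 keeps the elevation
# data at ~13.5 m/px while the hillshade raster goes down to ~1.7 m/px.
Z12_RES=$(awk 'BEGIN{printf "%.1f", 40075016.686/(512*2^12)*cos(3.14159265/4)}')
Z15_RES=$(awk 'BEGIN{printf "%.1f", 40075016.686/(512*2^15)*cos(3.14159265/4)}')
echo "z12: ${Z12_RES} m/px, z15: ${Z15_RES} m/px"

if command -v gdaldem >/dev/null; then
  # Pre-render the hillshade once, then tile it separately from the
  # terrain-RGB data; -compute_edges avoids seams at file borders.
  gdaldem hillshade -compute_edges dem_merged.vrt hillshade.tif
fi
```

The hillshade mbtiles would be much smaller than terrain-RGB at the same zoom, since plain greyscale compresses far better than elevation-encoded RGB.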
I don't think I had ever mentioned it, but I did test making a swisstopo Z0-Z17 set in terrarium format just to see if I could: https://tiles.wifidb.net/data/swissalti3d_terrarium_0-17/#8/46.695/8.064 https://stackblitz.com/edit/web-platform-9f92n2?file=index.html
It looks pretty good up until zoom 16/17, where it starts to get a bit blocky.
I do wonder, though, if I went back to terrainrgb format and found some better defaults for base and interval, whether it would look better. maplibre support for other base/interval values was added in https://github.com/maplibre/maplibre-style-spec/issues/326 . I also have not yet tested how it looks with just the default mapbox terrainrgb values, but it sounds like they are not good for high zoom levels.
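For reference, Mapbox Terrain-RGB packs elevation into the 24 bits of an RGB pixel as `height = base + (R*65536 + G*256 + B) * interval`, with defaults base=-10000 and interval=0.1 (0.1 m vertical precision over a ~1.6 km range). A smaller interval trades vertical range for precision, which is what the base/interval tuning above would exploit. A small sketch of the decode arithmetic:

```shell
# Decode a Terrain-RGB pixel; base and interval default to the Mapbox
# values but can be overridden, mirroring the maplibre base/interval option.
decode() {  # usage: decode R G B [base] [interval]
  awk -v r="$1" -v g="$2" -v b="$3" -v base="${4:--10000}" -v iv="${5:-0.1}" \
    'BEGIN{printf "%.1f", base + (r*65536 + g*256 + b)*iv}'
}

decode 1 134 160         # 0.0 m with the default base/interval
decode 1 134 160 0 0.01  # same pixel read as base=0, interval=0.01: 1000.0
```

With interval 0.01 the vertical precision is ten times finer, at the cost of a ten-times-smaller representable height range, so high-zoom tiles over limited terrain could plausibly look smoother with a tuned interval.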
After that I also tried to combine TINItaly with swisstopo with something like the following. It seemed like it was working, but eventually the system froze on me, so I never finished. I originally tried to use the tifs from both sets directly, but it complained about different projections, so I added an intermediate step of creating VRTs in the projection I wanted, then combining them into a file list to merge into a combined VRT later.
```bash
#!/bin/bash
# Requires a custom version of rio-rgbify which adds terrarium encoding
# support ( https://github.com/acalcutt/rio-rgbify/ )
INPUT_DIR=./download
OUTPUT_DIR=./output
vrtfile=${OUTPUT_DIR}/swiss_italy_terrainrgb0-15.vrt
vrtfile2=${OUTPUT_DIR}/swiss_italy_terrainrgb0-15_warp.vrt
mbtiles=${OUTPUT_DIR}/swiss_italy_terrainrgb0-15.mbtiles

[ -d "$OUTPUT_DIR" ] || mkdir -p "$OUTPUT_DIR" || { echo "error: $OUTPUT_DIR" 1>&2; exit 1; }

# Reproject each source TIFF to web mercator as a lightweight VRT
for i in /mnt/usb/tinitaly/input/*.tif; do
  gdalwarp -of VRT "$i" "${i%.tif}_out.vrt" -t_srs "EPSG:3857"
done
for i in /mnt/usb/swisstopo_to_mbtiles/download/*.tif; do
  gdalwarp -of VRT "$i" "${i%.tif}_out.vrt" -t_srs "EPSG:3857"
done

# Combine both reprojected sets into a single mosaic VRT, warp it,
# then encode the result into terrain-RGB mbtiles
printf '%s\n' /mnt/usb/tinitaly/input/*.vrt > filenames.txt
printf '%s\n' /mnt/usb/swisstopo_to_mbtiles/download/*.vrt >> filenames.txt
gdalbuildvrt -overwrite "${vrtfile}" -input_file_list filenames.txt
gdalwarp -r cubicspline -t_srs EPSG:3857 -dstnodata 0 -co COMPRESS=DEFLATE "${vrtfile}" "${vrtfile2}"
rio rgbify -e mapbox -b -10000 -i 0.1 --min-z 0 --max-z 15 -j 16 --format png "${vrtfile2}" "${mbtiles}"
```
Yes, you have to normalize projections before doing that. That was probably not necessary when using only one dataset, but using multiple sources makes that step mandatory.
You also have to do the dataset fusion at the borders BEFORE rendering takes place, otherwise it generates unwanted artefacts.
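One way to reduce those border artefacts, sketched here under the assumption that all sources get warped onto an identical target grid before mosaicking (filenames and the 30 m resolution are illustrative): `gdalwarp -tap` ("target aligned pixels") snaps output extents to multiples of the `-tr` resolution, so neighbouring files share exact grid lines where they meet.

```shell
# Align every source to one common grid before gdalbuildvrt sees them,
# so pixel edges coincide at dataset borders instead of half-overlapping.
TARGET_RES=30  # metres, illustrative
if command -v gdalwarp >/dev/null; then
  for f in source_a.tif source_b.tif; do
    gdalwarp -t_srs EPSG:3857 -tr "$TARGET_RES" "$TARGET_RES" -tap \
      -r cubicspline "$f" "${f%.tif}_aligned.tif"
  done
fi
echo "target grid: ${TARGET_RES} m"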
Any possibility to allow other inputs and make the config files more customizable?
For example, being able to specify DEMs with higher resolution (we now have 5 m for most western countries, and even LIDAR) and use AW3D30 as a fallback?
This will certainly require specifying the URL, file format, and re-projection.
The challenge will be to be careful about how to manage potential artefacts at tile and country borders when multiple DEMs overlap and do not have the same resolution.