wthorp closed this issue 5 years ago
You are correct, it's a memory issue. If you need a ballpark estimate of the memory requirements, just assume that your whole raster must fit in memory. If you need to process rasters larger than memory, I can suggest splitting them into smaller tiles before meshing (e.g. with https://www.gdal.org/gdal2tiles.html). But of course that's not a convenient solution.
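A minimal sketch of that tiling workflow, assuming GDAL's command-line tools are available alongside tin-terrain. The tile size, the paths, and the use of gdal_retile.py (a GDAL utility for splitting a raster into GeoTIFF tiles, rather than the gdal2tiles.py image-pyramid tool linked above) are illustrative assumptions, not part of tin-terrain itself:

```
# Illustrative sketch (not part of tin-terrain): split the large DEM into
# 4096x4096-pixel GeoTIFF tiles with GDAL's gdal_retile.py, then mesh each
# tile separately. Tile size and paths are assumptions; pick what fits in RAM.
mkdir -p /data/tiles
gdal_retile.py -ps 4096 4096 -targetDir /data/tiles /data/merit.tif

# Run dem2tintiles on each tile with the same options as the original command.
for tile in /data/tiles/*.tif; do
    /usr/local/bin/tin-terrain dem2tintiles \
        --input "$tile" \
        --output-dir /data/output \
        --min-zoom 5 --max-zoom 14 \
        --output-format=terrain \
        --max-error 2.0
done
```

Each tile then only has to fit in memory on its own; the trade-off is the border artifacts discussed below.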
I'm curious how others use this tool in the real world. If I cut my image into smaller parts, won't there be "seam lines", i.e. areas where the terrain has odd lines because they fall on the edges of the source DEMs? This 500 GB run was going to be my "small" test, as I had hoped to do the same thing at ~1 m instead of ~30 m resolution.
And you are correct, splitting the raster into chunks will produce artifacts at tile borders. However, how visible these artifacts are also depends on the lighting model you use to render the landscape. One more option is to rent one of these instances and do the work there.
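If you do go the tiling route, one possible mitigation (my own assumption, not something tin-terrain does for you) is to give neighbouring tiles a small overlap so that each tile sees a strip of its neighbours' elevations:

```
# Hypothetical variation on the split above: gdal_retile.py's -overlap option
# duplicates a 64-pixel strip along tile edges. The tiles are still meshed
# independently, so some seams may remain visible at the borders.
gdal_retile.py -ps 4096 4096 -overlap 64 -targetDir /data/tiles /data/merit.tif
```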
Anyway, at this point processing larger-than-memory datasets is not supported. Unfortunately I can't even say for sure whether it is part of the development backlog (although it might be added there at some point). I will close this issue for now.
I'm trying to run dem2tintiles via Docker on a fairly normal (16 GB RAM) Windows PC, and it is killed almost immediately with the message "Killed":
```
root@364d9c5d1d3a:/home# /usr/local/bin/tin-terrain dem2tintiles --input /data/merit.tif --output-dir /data/output --min-zoom 5 --max-zoom 14 --output-format=terrain --max-error 2.0
Opening raster file /data/merit.tif with GDAL...
reading raster data...
Killed
```

My image is large (500 GB).
I believe I'm running out of memory, but what sort of memory usage should I expect? Is there any kind of DEM "windowing" to limit memory usage? Or could I organize my data differently so that it doesn't require absurd amounts of memory?