heremaps / tin-terrain

A command-line tool for converting heightmaps in GeoTIFF format into tiled optimized meshes.
MIT License

"Killed" ... Memory usage? #57

Closed: wthorp closed this issue 5 years ago

wthorp commented 5 years ago

I'm trying to run dem2tintiles via Docker on a fairly normal (16 GB RAM) Windows PC and see the message "Killed" very quickly:

```
docker run -v e:\:/data:cached --name tin-terrain --rm -m 8g --memory-swap 128g --oom-kill-disable -i -t tin-terrain bash

root@364d9c5d1d3a:/home# /usr/local/bin/tin-terrain dem2tintiles --input /data/merit.tif --output-dir /data/output --min-zoom 5 --max-zoom 14 --output-format=terrain --max-error 2.0
Opening raster file /data/merit.tif with GDAL...
reading raster data...
Killed
```

My image is large (500 GB):

```
gdalinfo merit.tif
Driver: GTiff/GeoTIFF
Files: merit.tif
       merit.tif.ovr
Size is 432000, 305835
Coordinate System is:
PROJCS["WGS 84 / Pseudo-Mercator", ...
Band 1 Block=256x256 Type=Float32, ColorInterp=Gray
  NoData Value=-9999
  Overviews: 108000x76459, 27000x19115, 6750x4779, 1688x1195
```
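For scale, the uncompressed size implied by those dimensions (a single Float32 band at 4 bytes per pixel) works out to roughly 528 GB:

```
# back-of-the-envelope: 432000 x 305835 pixels, one Float32 band (4 bytes/pixel)
python3 -c "print(432000 * 305835 * 4 / 1e9, 'GB')"   # -> ~528 GB uncompressed
```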

I believe that I'm running out of memory, but what sort of memory usage should I expect? Is there any sort of DEM 'windowing' to limit memory usage? Is there another way to organize my data so that it doesn't require absurd amounts of memory?

fester commented 5 years ago

You are correct, it's a memory issue. For a ballpark estimate of memory requirements, assume that your raster must fit in memory. If you need to process rasters larger than memory, I can suggest splitting them into smaller tiles before meshing (e.g. with https://www.gdal.org/gdal2tiles.html), but of course that is not a convenient solution.
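As a rough illustration of that workflow, here is a minimal sketch that cuts one window out of the raster with gdal_translate and meshes only that part. The window offsets, sizes, and paths are placeholders, and merging the per-part tilesets afterwards is not covered here:

```
# Cut a 32768x32768-pixel window (about 4 GB as Float32) out of the big raster.
# -srcwin takes <xoff> <yoff> <xsize> <ysize> in pixels; 0 0 is just an example.
gdal_translate -srcwin 0 0 32768 32768 /data/merit.tif /data/merit_part_0_0.tif

# Mesh only that window, writing into a per-part output directory.
tin-terrain dem2tintiles --input /data/merit_part_0_0.tif \
    --output-dir /data/output/part_0_0 \
    --min-zoom 5 --max-zoom 14 --output-format=terrain --max-error 2.0
```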

wthorp commented 5 years ago

I'm curious how others use this tool in the real world. If I cut my image into smaller parts, won't there be "seam lines" - areas where the terrain shows odd lines because they sit on the edge of a source DEM? This 500 GB run was going to be my "small" test, as I had hoped to do the same thing at ~1 m resolution instead of ~30 m.

fester commented 5 years ago

And you are correct, splitting the raster into chunks will produce artifacts at tile borders. However, how visible these artifacts are also depends on the lighting model you use to render the landscape. One more option is to rent a large-memory cloud instance and do the work there.
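If you do go the splitting route, one way to reduce (though not eliminate) the border artifacts is to give adjacent tiles a small overlap, so that neighbouring meshes are built from the same border samples. A rough sketch with gdal_retile.py (the -overlap option needs a reasonably recent GDAL; tile size and overlap are arbitrary here):

```
# Split the raster into 16384x16384 tiles that share a 64-pixel overlap strip.
mkdir -p /data/parts
gdal_retile.py -ps 16384 16384 -overlap 64 -targetDir /data/parts /data/merit.tif
```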

Anyway, at this point processing larger-than-memory datasets is not supported. Unfortunately I can't say for sure whether it is part of the development backlog (although it might end up there eventually). I will close this issue for now.