DerKorb opened this issue 1 year ago
I have the same issue. Did you find a way to solve it?
I am currently using PotreeConverter and gocesiumtiler. Both have their own problems, though: gocesiumtiler can't handle datasets above a certain size, and Potree is awesome, but it uses its own format that does not work with Cesium.
I had the same issue, and fixed it by increasing Node's max heap size using the --max-old-space-size flag when starting the node process.
On Windows, I did it by first locating the source of ept:
Get-Command -Name "ept"
which returned C:\Program Files\nodejs\ept.ps1. I then gave Node an ample 24 GB of heap by modifying that script (C:\Program Files\nodejs\ept.ps1) as follows:
#!/usr/bin/env pwsh
$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent

$exe=""
if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) {
  # Fix case when both the Windows and Linux builds of Node
  # are installed in the same directory
  $exe=".exe"
}
$ret=0
if (Test-Path "$basedir/node$exe") {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "$basedir/node$exe" --max-old-space-size=24576 "$basedir/node_modules/ept-tools/lib/app/index.js" $args
  } else {
    & "$basedir/node$exe" --max-old-space-size=24576 "$basedir/node_modules/ept-tools/lib/app/index.js" $args
  }
  $ret=$LASTEXITCODE
} else {
  # Support pipeline input
  if ($MyInvocation.ExpectingInput) {
    $input | & "node$exe" --max-old-space-size=24576 "$basedir/node_modules/ept-tools/lib/app/index.js" $args
  } else {
    & "node$exe" --max-old-space-size=24576 "$basedir/node_modules/ept-tools/lib/app/index.js" $args
  }
  $ret=$LASTEXITCODE
}
exit $ret
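To sanity-check that the flag is actually being picked up, you can ask Node itself for its configured heap limit (this is plain Node, not anything specific to ept-tools):

node --max-old-space-size=24576 -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024)"

This prints the heap limit in MB and should report a number close to 24576.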
Note 1: 24 GB is maybe a bit of overkill; I just happened to have lots of RAM, so I used it to be safe. I monitored the child processes, and they collectively used about 2 GB at the worst point while tiling a ~5 GB EPT. So adjust this as you please!
Note 2: it would be useful to allow arguments to be passed to node when invoking ept, or to include a dedicated argument for the max heap size (?)
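Until something like that exists, a workaround that avoids editing ept.ps1 at all is the NODE_OPTIONS environment variable, which Node reads on startup and which accepts --max-old-space-size. A rough sketch in PowerShell (the dataset path is just a placeholder):

$env:NODE_OPTIONS = "--max-old-space-size=24576"
ept tile .\path\to\ept-dataset   # or whatever ept command you normally run

This only affects the current shell session, so it won't change Node's defaults for anything else.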
I get a heap out of memory error. I already tried increasing it to 25 GB, but it still does not work. I generated the dataset with Entwine and it is around ~50 GB in ~150k files. Is that just too much, or is there some way I can control the conversion so it won't run out of memory?