Closed: cedricr closed this issue 2 weeks ago
What failed is the new feature I added in 0.5.3: it crashed in delete_points
when freeing and reallocating the memory. I will fix it ASAP.
Is there a better way to see which file is currently processed, so that when there's a crash, I can know which file is the problematic one?

Use exec(..., verbose = TRUE) and run your script in a terminal (otherwise RStudio won't last long enough to let you see what is printed).
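The advice above as a runnable sketch, assuming a script built around a lasR pipeline (the script name, directory, and pipeline composition are placeholders, not the user's actual code):

```r
# process.R -- run from a terminal, e.g.:  Rscript process.R
# On a hard crash RStudio dies before showing output; a terminal keeps it.
library(lasR)

# Placeholder file list and pipeline; call names are assumptions.
files <- list.files("tiles", pattern = "\\.laz$", full.names = TRUE)
pipeline <- delete_points() + normalize() + rasterize(1, "max")

# verbose = TRUE prints progress as files are processed, so the last
# line printed before a crash identifies the problematic tile.
exec(pipeline, on = files, verbose = TRUE)
```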
Thanks a lot!
Fixed.
That being said, if I remove the normalize step, the pipeline doesn’t crash…
This is because, without normalize,
the pipeline can be streamed: zero memory is allocated and points are processed one by one through the pipeline, without loading the entire point cloud.
Oh, that was fast! Thank you so much.
I'm trying to extract vegetation data from the French IGN LidarHD data.
I'm running a simple pipeline on all tiles intersecting a city geometry, in order to generate a CHM. In some cases, the pipeline crashes (a hard crash in RStudio, with an "R Session Aborted" dialog that doesn't let me see any error message). In VSCode, I get the following error:
Here’s a minimal repro:
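A lasR pipeline along these lines matches the description (a sketch only: the tile path, the filter helper, and the exact lasR call semantics are assumptions, not necessarily the author's exact code):

```r
library(lasR)

f <- "tile_nice.laz"  # placeholder for the failing LidarHD tile near Nice

pipeline <- delete_points(filter = keep_class(5L)) +  # intent: retain only class 5
                                                      # (vegetation); filter semantics
                                                      # assumed -- check delete_points()
  normalize() +                                       # height above ground
  rasterize(1, "max")                                 # 1 m max-height raster (CHM)

exec(pipeline, on = f)
```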
What I think is happening: this tile is mostly sea and beach, near Nice. There's no vegetation (class 5) in it at all, so the delete_points step makes it an empty point cloud. Indeed, if I keep class 2 instead of 5, everything works. That being said, if I remove the normalize step, the pipeline doesn't crash… If I rewrite this pipeline using lidR,
I get the following warning: "Interpolation of 1429 points failed because they are too far from ground points. Nearest neighbor was used but interpolation is weak for those points", but no crash, and an empty point cloud as expected.

Subsidiary question: in order to find the problematic file, I had to replace a single pipeline run over many files with a for loop over the files, inside which I print the current file name and then run the pipeline on that file alone. Is there a better way to see which file is currently being processed, so that when there's a crash I can know which file is the problematic one?
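The for-loop workaround described above, sketched (the directory, file pattern, and pipeline composition are assumptions standing in for the user's own objects):

```r
library(lasR)

files <- list.files("tiles", pattern = "\\.laz$", full.names = TRUE)
pipeline <- delete_points() + normalize() + rasterize(1, "max")  # call names assumed

for (f in files) {
  message("Processing: ", f)  # printed before any hard crash in exec()
  exec(pipeline, on = f)
}
```

Running one file per exec() call is slower than passing all files at once, but the last "Processing:" line printed pinpoints the crashing tile.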