Closed whit0348 closed 5 years ago
> It would also be cool to operate on disk to save memory.
Hi,
Your suggestion is unclear and open to several interpretations. Let me try to answer each interpretation in turn.
Do you mean normalizing at read time? This is time consuming. Are you expecting to normalize your point cloud each time you read it? I don't see the interest of that: time is expensive, storage is cheap. If it is really what you want, you can easily define your own read function like:
```r
readnormalized <- function(...) {
  las <- readLAS(...)
  las <- lasnormalize(las, tin())
  return(las)
}
```
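Used this way, normalization happens every time a tile is read. A hypothetical usage sketch, assuming lidR 2.x function names and a file name of my own invention:

```r
library(lidR)

# Hypothetical: the file name is an assumption for illustration.
las <- readnormalized("tile_01.las")

# The normalized cloud can then feed any in-memory tool, e.g. a
# canopy height model at 1 m resolution:
chm <- grid_canopy(las, res = 1, p2r())
```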
Do you mean streaming normalization? Streaming algorithms process the point cloud while reading it, immediately writing the result for each point sequentially into a file. That way the program does not load the whole point cloud; it only needs to store the small buffer of data required for the processing.
In lidR, `catalog_retile` and `lasclip` are streamed, but I'm not able to write a streamed normalization; it is too hard for me. If you want blazingly efficient streamed tools you can use lastools. lidR is an R package and consequently relies on "in memory" tools by design, because (1) users must be able to interact with the data in a classical R way, and (2) it also relies on third-party packages that include "in memory" algorithms. The goal of lidR is to "explore". It is a flexible and easy-to-tune toolbox compared to lastools, but it will never try to beat it in terms of efficiency.

Do you mean storing the normalized elevation along with the non-normalized points? This is already possible using extra bytes, but you must write your own tool and then use `catalog_apply` to process several tiles with a buffer, and so on. Here are the basics:
```r
las  <- readLAS(...)
Zref <- las@data$Z                       # keep the raw elevation before normalizing
las  <- lasnormalize(las, tin())
las  <- lasaddextrabytes(las, Zref, name = "Zref", desc = "Raw elevation")
writeLAS(las, ...)
```
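To run this over a whole catalog with a buffer, the same logic can be wrapped in a function for `catalog_apply`. A rough sketch, assuming lidR 2.x names; the output folder, the 20 m buffer, and the `cluster@files` access are assumptions to be adapted:

```r
library(lidR)

# Sketch only: normalize each tile, store the raw elevation as an
# extra byte, and write the result. Adapt paths and buffer size.
normalize_tile <- function(cluster) {
  las <- readLAS(cluster)
  if (is.empty(las)) return(NULL)

  Zref <- las@data$Z                     # keep the raw elevation
  las  <- lasnormalize(las, tin())
  las  <- lasaddextrabytes(las, Zref, name = "Zref", desc = "Raw elevation")

  las  <- lasfilter(las, buffer == 0)    # drop the buffer before writing
  writeLAS(las, file.path("normalized", basename(cluster@files[1])))
  return(NULL)
}

ctg <- catalog("path/to/tiles/")
opt_chunk_buffer(ctg) <- 20              # buffer needed by tin() at tile edges
catalog_apply(ctg, normalize_tile)
```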
> It would also be cool to operate on disk to save memory.
It depends what you mean, but it is much, much harder for a point cloud than for a raster because of the absence of a data structure. Again, the goal of lidR is to allow users to develop their ideas easily. This ease of use has a cost.
I'm really keen to discuss these questions. I'd like to improve the package as much as possible, so do not hesitate to define more accurately what you mean and what your expectations are. But keep in mind that we are working within the R environment and, while it is technically possible to reproduce lastools in R, it is not my goal and I'm not skilled enough to achieve this task anyway.
Great tool, but it would be quite nice for some algorithms to have an option to normalize on the fly, with an option to store normalized las files. It would save tons of disk space needed to store normalized las catalogs in order to run functions like grid_canopy or grid_metrics, and RAM when running individual tiles. If there is currently a workaround, please let me know!