Closed wiesehahn closed 2 years ago
Ok, I am sorry for the issue and for not being patient enough.
After running the script again and waiting a little longer this time, files are being written again after a while. So the problem was that all the previous tiles could not be written (several hundred tiles in the west of the area cover shoreline, where my metric could not be calculated).
(Although I am still unsure whether this was also the case in previous attempts.)
No idea, and your explanation is hard to grasp. It is unclear when the script writes nothing and when it misses writing some random files. If I understand correctly (which I do not), when you do not assign the attribute `processed` it works well, but when you do assign the attribute `processed` it starts missing files randomly. Am I right?
Your script looks perfect at first glance.
> If I understand it right, `opt_stop_early(ctg) <- FALSE` makes the process continue for files without errors. So if this warning were the reason no files are written, the engine should write a file once it processes a file without errors again, right?
I don't think the warning is critical for lidR if it comes from `myMetrics`, but it is critical in your code because it means there is an unexpected NA somewhere. If it comes from lidR, however, it is a big issue and I need a reproducible example.
I will close this issue since it is working currently, and reopen it if I experience the problem again and it turns out not to be related to the files.
Not related at all, but using `opt_*()` inside a function is often bad practice. Actually, you can do it the way you want because you are both the user and the developer. But when the users are not the developer, `opt_*()` functions are expected to be called by users on the user side, not by the developer. In summary, do not package your code as-is for other users :wink:
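To illustrate the recommendation, here is a sketch of the intended split between developer code and user code. The lidR functions (`readLAScatalog`, `opt_output_files`, `opt_stop_early`, `pixel_metrics`) are from the package's documented API; the path, output template, metric, and the name `process_catalog` are placeholders, and the snippet assumes a directory of LAS/LAZ tiles exists:

```r
library(lidR)

# Developer side: the packaged function only processes the catalog.
# It does not call any opt_*() function, so it never overrides the
# caller's processing options.
process_catalog <- function(ctg) {
  pixel_metrics(ctg, ~list(zmin = min(Z)), res = 20)
}

# User side: the user configures the catalog before calling the function.
ctg <- readLAScatalog("path/to/tiles")
opt_output_files(ctg) <- "out/tile_{XLEFT}_{YBOTTOM}"
opt_stop_early(ctg)   <- FALSE
process_catalog(ctg)
```

This way the user keeps full control over chunking, output files, and error handling, and the packaged function stays reusable.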
Hey, when I try to process a LAScatalog, it happens from time to time that the catalog seems to be processed (CPU and memory are used, and the progress bar counts upwards for processed files), yet no files are written to the specified output path. This seems to happen quite randomly; sometimes, after restarting the R session, the exact same script resulted in files being written to disk.
At least this time it might be related to the files themselves, as I define which files should be processed, and currently no files are written when I try to continue processing the catalog (I tried several times). But when I skip this part of the script and instead start processing the entire catalog, it begins writing files to disk. However, I am fairly sure (although not 100%) that on previous occasions it was not related to certain files, as no files were written in the first attempt and the process succeeded for the same files in the second or third attempt.
So at this time I get

```
Warning messages:
1: In min(x) : no non-missing arguments to min; returning Inf
```

which somehow results from my function.

If I understand it right, `opt_stop_early(ctg) <- FALSE` makes the process continue for files without errors. So if this warning were the reason no files are written, the engine should write a file once it processes a file without errors again, right? I did not wait that long until now, but since several hundred files were processed without a single file being written, I assumed this was not the reason.

Sorry, it is hard to explain and make reproducible, but maybe there are ideas why no files are written?
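For what it's worth, that warning is base R's documented behavior when `min()` is called with no non-missing values: it returns `Inf` and warns. A minimal guard inside the metrics function avoids it; `myMetrics` and the `zmin` attribute below are hypothetical stand-ins for the actual script, not code from this thread:

```r
# Hypothetical metrics function with a guard against empty / all-NA input,
# which is what triggers "no non-missing arguments to min; returning Inf".
myMetrics <- function(z) {
  z <- z[!is.na(z)]
  if (length(z) == 0) {
    # Return NA explicitly instead of letting min() return Inf with a warning
    return(list(zmin = NA_real_))
  }
  list(zmin = min(z))
}

myMetrics(numeric(0))  # zmin is NA, no warning
myMetrics(c(3, 1, 2))  # zmin is 1
```

Returning `NA_real_` (rather than `Inf`) also keeps the output raster/attribute numeric and makes shoreline chunks without computable points easy to filter out afterwards.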