Closed wiesehahn closed 4 months ago
Not sure if this is expected, but I wanted to try this with the included sample data, and executing the pipeline crashes R.
library(lasR)
pipeline <- reader_las() + normalize(extrabytes = T) + write_las()
f <- system.file("extdata", "las14_pdrf6.laz", package="lasR")
points <- lidR::readLAS(f)
ans <- exec(pipeline, on = f) # crashes!
What is the format of your file that is supposed to have HAG but does not? LAS 1.4?
Which version are you using? I cannot reproduce the crash.
> What is the format of your file that is supposed to have HAG but does not? LAS 1.4?
Yes, 1.4; you can see it in the output above.
> Which version are you using? I cannot reproduce the crash.
It was crashing with 0.7.2. I have now updated to 0.9.1 and it no longer crashes. This is the result for me:
library(lasR)
f <- system.file("extdata", "las14_pdrf6.laz", package="lasR")
las1 <- templas()
pipeline1 <- reader_las() + write_las(ofile = las1)
exec(pipeline1, on = f)
#> [1] "C:\\Users\\JWIESE~1\\AppData\\Local\\Temp\\Rtmp2hEfsc\\file18058a70671e.las"
file.exists(las1)
#> [1] TRUE
las2 <- templas()
pipeline2 <- reader_las() + normalize(extrabytes = T) + write_las(ofile = las2)
exec(pipeline2, on = f)
#> NULL
file.exists(las2)
#> [1] FALSE
las3 <- templas()
pipeline3 <- reader_las() + normalize() + write_las(ofile = las3)
exec(pipeline3, on = f)
#> NULL
file.exists(las3)
#> [1] FALSE
OK good, that was my thought; I already fixed it. However, the file has 0 ground points, thus no triangulation, thus no normalization: all the points are discarded. This is the only way to run an entire pipeline safely without stopping at, say, 90% because a single file (e.g. one over a lake) has 0 ground points. I should add a warning, though.
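The skip behaviour described above can be sketched in plain R. `normalize_file` is a hypothetical stand-in for the pipeline step, not part of the lasR API:

```r
# Hypothetical sketch: a Delaunay triangulation needs at least 3 ground
# points (class 2 in the LAS spec); without them the file yields no output
# instead of aborting the whole run.
normalize_file <- function(classification) {
  n_ground <- sum(classification == 2L)
  if (n_ground < 3L) {
    warning("0 ground points: no triangulation, file skipped")
    return(NULL)  # mirrors exec() returning NULL and writing no file
  }
  "normalized.las"  # placeholder for the output path
}

suppressWarnings(normalize_file(c(1L, 1L, 5L)))  # NULL, nothing written
normalize_file(c(2L, 2L, 2L, 1L))                # "normalized.las"
```

A warning, as suggested above, would make this silent skip visible without stopping the batch.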
Now, the problem with HAG is likely caused by the LAS 1.4 format. I will check next Tuesday. Can you share a file?
TLDR: if your file is LAS compliant, this should be fixed in LASlib, but I'm pretty sure your file is not compliant with the specifications.
I found the issue. Honestly, I don't know if it should be fixed on my side or on your side. I'm not sure your file is compliant with the specifications; if it is, it must be fixed on the LASlib side, i.e. in LAStools.
The header of your original lassample.las has 3 extra-bytes VLR definitions for a total of 8 extra attributes:
variable length header record 2 of 5:
reserved 0
user ID 'LASF_Spec'
record ID 4
length after header 768
description 'RIEGL Extra Bytes'
Extra Byte Descriptions
data type: 3 (unsigned short), name "Amplitude", description: "Echo signal amplitude [dB]", min: 0, max: 10000, scale: 0.01, offset: 0 (not set)
data type: 3 (unsigned short), name "Pulse width", description: "Full width at half maximum [ns]", min: 1, max: 10000, scale: 0.1, offset: 0 (not set)
data type: 4 (short), name "Reflectance", description: "Echo signal reflectance [dB]", min: -5000, max: 15000, scale: 0.01, offset: 0 (not set)
data type: 3 (unsigned short), name "Deviation", description: "Pulse shape deviation", min: 0, max: 65535, scale: 1 (not set), offset: 0 (not set)
variable length header record 3 of 5:
reserved 0
user ID 'LASF_Spec'
record ID 4
length after header 192
description ''
Extra Byte Descriptions
data type: 1 (unsigned char), name "confidence", description: "confidence values", scale: 1 (not set), offset: 0 (not set)
variable length header record 4 of 5:
reserved 0
user ID 'LASF_Spec'
record ID 4
length after header 576
description 'TerraScan Extra Bytes'
Extra Byte Descriptions
data type: 6 (long), name "Distance", description: "Distance", min: 2147483649, max: 2147483647, scale: 1, offset: 0 (not set)
data type: 5 (unsigned long), name "Group", description: "Group", min: 0, max: 4294967295, scale: 1, offset: 0 (not set)
data type: 5 (unsigned long), name "Normal", description: "Normal vector 2+15+15 bits", min: 0, max: 0, scale: 1, offset: 0 (not set)
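A quick check confirms the header dump is internally consistent: each extra-bytes descriptor occupies 192 bytes, so the three LASF_Spec record-ID-4 VLRs together describe 8 attributes (payload sizes taken from the "length after header" fields above):

```r
# Each extra-bytes descriptor is a fixed 192-byte record; the payload
# sizes below come from the three VLRs in the header dump above.
vlr_payloads <- c(768L, 192L, 576L)
descriptors  <- vlr_payloads %/% 192L  # 4, 1, 3 descriptors per VLR
sum(descriptors)                       # 8 extra attributes in total
```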
When read with LASlib, only the last definition is retained because only one definition is allowed (this should be checked with the ASPRS group).
Thus, when read with lidR, lasR, or LAStools, only 3 extra bytes are seen. According to the header it should see `Distance`, `Group`, and `Normal`, and these are the column names you get with `lidR::readLAS()`, but BE CAREFUL!! The data actually loaded are corrupted because the actual extra bytes do not match the description: the first 3 extra bytes are likely `Amplitude`, `Pulse width`, and `Reflectance`, which are of different types.
In short, your file is not valid, and this propagates to the output file.
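To see why the loaded values are garbage, compare the per-point byte layout the file actually stores with the layout the single retained VLR claims. The type sizes follow the LAS 1.4 extra-bytes data-type codes; the attribute ordering is taken from the header dump above:

```r
# Bytes per LAS extra-bytes data-type code (only the codes used here):
# 1 = uchar (1), 3 = ushort (2), 4 = short (2), 5 = ulong (4), 6 = long (4)
type_size <- c(`1` = 1L, `3` = 2L, `4` = 2L, `5` = 4L, `6` = 4L)

# What each point actually carries, in VLR order (all 8 attributes):
actual  <- type_size[c("3", "3", "4", "3", "1", "6", "5", "5")]
# What the retained (last) VLR claims (Distance, Group, Normal only):
claimed <- type_size[c("6", "5", "5")]

sum(actual)   # 21 bytes of extra payload per point
sum(claimed)  # 12 bytes: the reader slices the payload at the wrong offsets
```

With only the last VLR kept, a reader interprets the first bytes of the payload (really the unsigned-short `Amplitude`, etc.) as a 4-byte `Distance`, which is exactly the corruption described above.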
Alright, but I guess the problems with my file are not related to your test file, which no longer crashes R in recent versions but also does not create any output. Not sure if this is related to this issue, though.
library(lasR)
f <- system.file("extdata", "las14_pdrf6.laz", package="lasR")
las3 <- templas()
pipeline3 <- reader_las() + normalize() + write_las(ofile = las3)
exec(pipeline3, on = f)
#> NULL
file.exists(las3)
#> [1] FALSE
f <- system.file("extdata", "Topography.las", package="lasR")
las4 <- templas()
pipeline4 <- reader_las() + normalize() + write_las(ofile = las4)
exec(pipeline4, on = f)
#> [1] "C:\\Users\\JWIESE~1\\AppData\\Local\\Temp\\RtmpG6ShkM\\file1544717452e4.las"
file.exists(las4)
#> [1] TRUE
No output for `las14_pdrf6` is expected: no ground points -> no triangulation -> no normalization possible. A 0-point file is created, but 0-point files are not actually saved. This is the best behavior, and an error is not an option because the file could be number 995/1000 and you don't want the pipeline to stop at 99.5% because one file (e.g. over a lake) has no ground points.
As I said earlier, the best I can do is to add a warning.
According to the discussion on the ASPRS repo, your file is valid, but most readers on the market won't read it properly. I modified LASlib to concatenate the multiple VLRs. It is read properly now (I guess).
Your example:
f <- "lassample.las"
ans <- exec(pipeline, on = f)
points <- lidR::readLAS(ans)
summary(points@data$HAG)
works because `ans` is a repaired version of `f`. To read `f` properly with lidR you must update rlas, where I also fixed the issue.
While adding the normalized height data as the extra attribute `HAG` works on the sample file, it does not add the attribute to my data. Is it possibly related to the point data record format?!