r-lidar / rlas

R package to read and write las and laz files used to store LiDAR data
https://cran.r-project.org/package=rlas
GNU General Public License v3.0

Problem lasnormalized LASlib #55

Closed THLAB35 closed 2 years ago

THLAB35 commented 2 years ago

Hi,

I managed to create a DTM with no problem. However, when I try to create a DSM, I run into a problem from the start and this error appears:

cannot open file "chemin d'accès" for write
ERROR: cannot open laswriterlas with file name
LASlib internal error. See message above.
In addition: Warning message:
lasnormalize is deprecated. Use normalize_height instead. 

Thank you in advance for all the help you could give me.

Jean-Romain commented 2 years ago

Please provide a minimal reproducible example. An error without the code that generated it does not help.

Jean-Romain commented 2 years ago

First, lasnormalize has been deprecated for 17 months. See https://github.com/Jean-Romain/lidR/releases/tag/v3.0.0

In addition: Warning message:
lasnormalize is deprecated. Use normalize_height instead. 

But this does not explain the error. I need to see more.
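For reference, the renamed call in lidR >= 3.0 looks like the sketch below (the file name, resolution, and object names are placeholders, not taken from this thread):

```r
library(lidR)

# lidR 3.0 renamed lasnormalize() to normalize_height(); the old name
# still works for now but emits the deprecation warning seen above.
las  <- readLAS("points.las")                           # placeholder file
dtm  <- grid_terrain(las, res = 1, algorithm = tin())   # terrain model
nlas <- normalize_height(las, dtm)                      # replaces lasnormalize()
```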

I see

cannot open file "chemin d'accès" for write

What is chemin d'accès ("file path" in French)? Show me the file names:

head(las_cat_A2$filename)

THLAB35 commented 2 years ago

head(las_cat_A2)
  File.Signature: LASF
  File.Source.ID: 10
  GUID: 00000000-0000-0000-0000-000000000000
  Version: 1.2
  System.Identifier: LAStools (c) by rapidlasso GmbH
  Generating.Software: Global Mapper
  File.Creation: day 248, year 2021
  Header.Size: 227
  Offset.to.point.data: 644
  Number.of.variable.length.records: 3
  Point.Data.Format.ID: 3
  Point.Data.Record.Length: 34
  Number.of.point.records: 286543893
  X/Y/Z scale factors: 0.001, 0.001, 0.001
  X/Y/Z offsets: 256323.2, 5362824, 0
  Extent: X 256323.2 to 257272.7, Y 5362824 to 5363904, Z 117.89 to 183.905
  CRS: 2949
  Returns: 1st 252261543, 2nd 34282350, 3rd/4th/5th 0
  filename: C:\Travail\LiDAR\S21088\S21088_PointCloud_Classified_A2.las

Jean-Romain commented 2 years ago

Please copy and paste what I wrote, not only half of it.

Jean-Romain commented 2 years ago

C:Travail/LiDAR/S21088_SNC_Jonquiere/las_out/norm_A2/norm_256000_5362500.las is not a valid path on Windows: the separator after the drive letter C: is missing.
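A malformed drive path like this can be caught programmatically across a catalog's file names. A minimal sketch in base R (is_valid_win_path is a hypothetical helper, not part of lidR or rlas):

```r
# Check that a path starts with a drive letter followed by a separator,
# e.g. "C:/..." or "C:\...". Paths like "C:Travail/..." fail this test.
is_valid_win_path <- function(p) grepl("^[A-Za-z]:[/\\\\]", p)

is_valid_win_path("C:Travail/LiDAR/las_out/norm.las")   # FALSE
is_valid_win_path("C:/Travail/LiDAR/las_out/norm.las")  # TRUE
```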

Jean-Romain commented 2 years ago

Is it possible for you to show me the paths with head(las_cat_A2$filename)?!

Jean-Romain commented 2 years ago

Do you have only one file?

That looks like an encoding issue. Please try

rlas::read.las(las_cat_A2$filename)

Jean-Romain commented 2 years ago

So it is a problem with lidR: there is a badly encoded string somewhere. Try

cl = catalog_makechunks(las_cat_A2)
readLAS(cl[[1]])

THLAB35 commented 2 years ago

class       : LAS (v1.2 format 3)
memory      : 25.6 Gb
extent      : 256323.2, 257272.7, 5362824, 5363904 (xmin, xmax, ymin, ymax)
coord. ref. : NAD83(CSRS) / MTM zone 7
area        : 0.8 km²
points      : 286.54 million points
density     : 359.85 points/m²

Jean-Romain commented 2 years ago

I don't know what to say. Please try to make a reproducible example.

opt_output_files(las_cat_A2) <- "C:/las_out/norm_A2/norm_{XLEFT}_{YBOTTOM}"
lasnorm_A2 <- lasnormalize(las_cat_A2, dtm_A2)

That is not enough. What did you do before that? Show me a summary of las_cat_A2. Try to make something reproducible.

Actually, I'm lost in your issue because I asked you for the same information many times before getting anything; you gave me partial answers, then you changed your answers, and so on.

Jean-Romain commented 2 years ago

Your file is absolutely unreadable. Code, outputs, errors, and comments are all mixed up without any formatting or code fences. There are several calls to unrelated packages (e.g. tmap). The last thing I see is

In dir.create(dir, recursive = TRUE) :
  cannot create dir 'D:\lidar_out', reason 'Permission denied'

Which is an error on your side. If you do not have the rights to write in D:, do not write in D:.
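That kind of permission failure can be checked up front, before pointing opt_output_files() at a folder. A minimal sketch in base R (the directory is a placeholder; note that file.access can be unreliable on some Windows setups):

```r
# Verify the output directory can be created and written to before
# handing it to opt_output_files(). out_dir is a placeholder path.
out_dir <- file.path(tempdir(), "lidar_out")
ok <- dir.exists(out_dir) ||
  dir.create(out_dir, recursive = TRUE, showWarnings = FALSE)
if (!ok || file.access(out_dir, mode = 2) != 0)
  stop("Cannot write to ", out_dir)
```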

Notice that you are showing me yet another error never mentioned earlier. I'm sorry, but I give up.