connormanning / entwine

Entwine - point cloud organization for massive datasets
https://entwine.io

default scale value not 1 #272

Closed julienlau closed 1 year ago

julienlau commented 2 years ago

Hi,

I don't use the scale option, but in my case running without it gives me a default scale of 0.01. Is there some kind of heuristic that determines an optimal scale? If not, why not keep the default at scale=1?

Dimensions: [
        X:int32, Y:int32, Z:int32, Intensity:uint16, ReturnNumber:uint8,
        NumberOfReturns:uint8, ScanDirectionFlag:uint8, EdgeOfFlightLine:uint8,
        Classification:uint8, ScanAngleRank:float32, UserData:uint8,
        PointSourceId:uint16, GpsTime:float64
]
Points: 117,446,806
Bounds: [xxx]
Scale: 0.01
SRS: none

Regards

defaultbranch commented 2 years ago

@julienlau

scale is possibly a misnomer, coming from the LAS/LAZ standards, and --precision might have been a clearer name for this Entwine option. For point coordinates in meters, the default 0.01 would then translate to centimeter accuracy, which seems reasonable to me.

See the description in https://entwine.io/configuration.html#absolute, or see https://github.com/connormanning/entwine/blob/master/entwine/io/laszip.cpp for the implementation. As far as I understand, Entwine scales coordinates to integer types for LAZ compression, and in that case the scale factor determines the precision. But I am a new user of Entwine and may be wrong…

…can somebody confirm or correct my statement?
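To illustrate the idea above, here is a minimal sketch (not Entwine's actual code) of LAS-style scaled-integer coordinate storage, where a floating-point coordinate is quantized to an int32 via a scale and offset. With the default scale of 0.01 and meter units, anything finer than a centimeter is rounded away. The offset value here is purely hypothetical.

```python
SCALE = 0.01        # Entwine's default scale, i.e. centimeter precision for meter units
OFFSET = 500000.0   # hypothetical per-axis offset (LAS headers store one per axis)

def encode(coord: float, scale: float = SCALE, offset: float = OFFSET) -> int:
    """Quantize a coordinate to a scaled integer, as LAS/LAZ stores it."""
    return round((coord - offset) / scale)

def decode(stored: int, scale: float = SCALE, offset: float = OFFSET) -> float:
    """Recover the (quantized) coordinate from the stored integer."""
    return stored * scale + offset

x = 500123.456789
stored = encode(x)          # integer actually written to the file
restored = decode(stored)   # round-trips to the nearest centimeter only
```

With scale=1 the stored values would be whole meters, which is why a smaller scale (higher precision) is the more useful default for typical survey data.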