Closed: Jean-Roc closed this issue 1 year ago
I think this might have more to do with point record format 8 than the 18 billion points. It seems like something is outside the bounds of uint8_t. Any chance you can post the file online somewhere?
Each voxel is divided up into some number of cells in each direction X/Y/Z. As points are sampled into those cells, a calculation is made to determine the cell index. This is failing because a point's index exceeds that which is defined:
At one point the number of cells was too large, which would cause this problem. Make sure that you are running a version with a CellCount of 128, or someone should debug the math that calculates the actual number of cells in each direction of the voxel to make sure that it doesn't exceed 255.
@hobu here are the results and the sources; stats are available in [ept.json](https://webimaging.lillemetropole.fr/externe/lidar/2018_scot/ept/ept.json)
@abellgithub how could I modify this to get that CellCount?
This needs to be debugged. You need to make sure that the values here are <= 255:
https://github.com/hobuinc/untwine/blob/main/bu/VoxelInfo.hpp#L68-L70
If that's true, you need to verify that the points are within the bounds: check that the bounds declared in your input's header match the actual extent of your data.
Just a little clarification about the pdal info output: because the input must be named input.copc.laz, the output actually points out a header issue.
pdal info --metadata path\input.copc.laz
PDAL: readers.copc: Invalid LAS header in COPC file
As pointed out by Andrew, the error was caused by invalid laz chunks. We went back to the source data to resolve it and produced the COPC (111 GB!).
Last week I processed ~1000 LAZ files to COPC (individually, not into a single COPC file), and I saw this error for roughly 20 input files. The error seems somewhat random: when I restart indexing of the failing files, it succeeds afterwards (in one case I had to restart indexing of a single file maybe 3x until it succeeded).
This is one of the files that failed (365 MB): https://drive.google.com/file/d/195WeoI_DdONHWxAYxfV2ZNGpgmBJj8pk/view?usp=sharing
@hobu I would suggest reopening this ticket (I am unable to do that).
I've identified an issue that can cause points to be placed in the wrong cell in rare circumstances, which can cause this error. See #131
Hi,
We tried to convert a LAZ tile dataset of 18 billion points (point record format 8) to COPC using untwine:
This version was compiled from master and built against PDAL 2.4.1.
The process ends abruptly with this error:
Running pdal info returns:
The written output is ~100 GB and the input was ~80 GB (unordered by GPS timestamp); it seems to have happened at the end of the conversion.
Could you tell me if we have reached a hard limit of the COPC/LAZ format?
Regards, Jean-Roc