potree / PotreeConverter

Create multi-res point clouds for use with potree
http://potree.org
BSD 2-Clause "Simplified" License

Problem converting a *.las file converted from a *.ptx file with CloudCompare #438

Open Thomas62149 opened 4 years ago

Thomas62149 commented 4 years ago

Hi Markus,

I'm testing PotreeConverter 2.01 on a large series of .las files. It works great for the large majority of them, but this particular .las file is impossible to convert with PotreeConverter 2.01. PotreeConverter seems to fail in the final "INDEXING" step: it shows "0% INDEXING" in the console for about 30 seconds and then closes. The .ptx file was exported by the Leica software "Cyclone REGISTER 360" from a BLK scan. Here is the link to download the .las file: https://www.grosfichiers.com/u5mcnndZsPC. I've also tried to convert the .ptx directly with PotreeConverter 2.01, but it crashes the .exe at the beginning of the analysis of the file.

Thanks for your answer

Best regards,

Thomas MARTEL

smcavoy12 commented 4 years ago

Hi Thomas,

I tested it, and it failed. I used LAStools' lasduplicate64 to remove 3080 duplicate points, and then it ran fine.

I've been having to do this with most of my TLS datasets; it's now a standard part of my process for prepping files for potree: export the e57 to separate .las scans, then run lasduplicate on the whole directory.
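A minimal sketch of that batch step, assuming LAStools is on PATH (directory and file names here are hypothetical, not from the thread):

```shell
# Remove exact duplicate points from every exported scan before
# handing them to PotreeConverter; write results to a new directory.
mkdir -p cleaned
for f in scans/*.las; do
    lasduplicate64 -i "$f" -odir cleaned -olas
done
```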

midnight-dev commented 4 years ago

Did this workload normally succeed without issue on 1.7? If this is more of a workaround than it is a workflow, then this could be a big red flag that there's been a regression in program behavior. To be fair, a new major version - especially a total rewrite - is expected to be different.

I believe the intended behavior is now to remove the data of confirmed duplicates, no matter the amount; however, I think it's only partially implemented. Need to revisit this after Markus confirms duplicates are gracefully handled.

smcavoy12 commented 4 years ago

@midnight-dev On 1.7 it stalls, stuck on "READING" for 20 minutes, then starts indexing the first 1 million points. Very slow, so I didn't let it go further. On other files with small numbers of duplicates, the behavior I've observed is a massive slowdown but eventual completion. I can't provide specific examples, but I'll save them from here on out if it's useful to you.

Thomas62149 commented 4 years ago

Thanks for your answers. The duplicate points are the problem. Markus is working on that type of problem to make PotreeConverter more robust. I use the command-line option "-SS SPATIAL 0.001" in CloudCompare to subsample the cloud, and then the integration of the .las into Potree works perfectly. Thanks
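For reference, a full CloudCompare command-line invocation using that option might look like this (a sketch; the input file name and the export/save flags are assumptions, not quoted from this thread):

```shell
# Subsample so no two points are closer than 1 mm, which collapses
# exact duplicates, then save the result back out as LAS.
CloudCompare -SILENT -O scan.las -SS SPATIAL 0.001 -C_EXPORT_FMT LAS -SAVE_CLOUDS
```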

m-schuetz commented 4 years ago

I've made a build with a fix here: http://potree.org/temporary/PotreeConverter_drop_duplicates_2020.09.09.zip

If you had problems with errors due to duplicates, please give it a try and let me know if it works. Note that this fixes cases where there are tens of thousands of duplicates, perhaps even a couple of million. It likely won't work if there are tens of millions of points at the exact same coordinate, in which case you'll still have to sanitize the data before using the converter.

FrenchTrott commented 4 years ago

Hi Markus, it worked on our end on 3 problematic files.

Thanks !

When you talk about sanitizing a file, what do you think is the best method?

m-schuetz commented 4 years ago

> When you talk about sanitizing a file, what do you think is the best method?

You could use CloudCompare's command-line mode, as suggested by @Thomas62149:

> I use the command-line option "-SS SPATIAL 0.001" in CloudCompare to subsample the cloud, and then the integration of the .las into Potree works perfectly.

FrenchTrott commented 4 years ago

Yes, but CloudCompare uses as much RAM as the file size. If the file is 300 GB, that's complicated. Do you have another idea?

m-schuetz commented 4 years ago

Do you only have a single large 300 GB file? Aren't there tiled files or separate scan locations? If you only have the 300 GB file, you might have to split it into smaller tiles and apply the subsampling operation to each file. I'm not sure whether CloudCompare can partition a point cloud into tiles, but lastile (from LAStools) can.
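A sketch of that tiling approach with LAStools (tile size, paths, and output directories are placeholders, not from the thread):

```shell
# Split the huge file into 500 m x 500 m tiles, then deduplicate
# each tile separately so peak memory use stays bounded.
lastile -i huge.las -tile_size 500 -odir tiles -olas
lasduplicate64 -i tiles/*.las -odir tiles_clean -olas
```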

m-schuetz commented 4 years ago

You could also give PDAL a try; its thinning filters might help. Generally, any subsampling/thinning operation might also drop points that were fine, but if you set the minimum distance/radius/spacing arguments reasonably low, there is a good chance that only duplicates, or almost only duplicates, will be removed.
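With PDAL, a minimal thinning command could look like this (a sketch; the 1 mm radius is an assumption chosen to target duplicates while keeping real points):

```shell
# pdal translate with filters.sample drops any point within 1 mm
# of an already-kept point, i.e. duplicates and near-duplicates.
pdal translate input.las thinned.las sample --filters.sample.radius=0.001
```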

FrenchTrott commented 4 years ago

Thanks Markus.

When will this fix for the issue be integrated into an official release?