CloudCompare / CloudComPy

Python wrapper for CloudCompare

Intensity problem #64

Open · Masrhalsp opened 2 years ago

Masrhalsp commented 2 years ago

Hello, first I would like to express my gratitude for adding the ccSensors attribute for getting the transformation data from E57 files.

As I was using the code, I encountered something strange when converting the ScalarField (intensity) to a NumPy array: roughly the last 50 elements of the array are zero, although when I open the same data with pye57 there are no zeros in the last elements. Thanks in advance.
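For illustration, a minimal sketch of the comparison described above; the file name and scan index are placeholders, and entities[1] holds the clouds, as in the script further down in this thread:

import pye57
import cloudComPy as cc

path = "scan.e57"  # placeholder: an E57 file exhibiting the problem

# read the intensity with pye57: no trailing zeros here
data = pye57.E57(path).read_scan(0, intensity=True)
print("pye57 tail:", data["intensity"][-50:])

# read the same file with CloudComPy: trailing zeros appear
entities = cc.importFile(path)
sf = entities[1][0].getScalarField(0)
print("CloudComPy tail:", sf.toNpArray()[-50:])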

prascle commented 2 years ago

Hello, I think this is related to invalid points. It seems that when CloudCompare reads a file with invalid points, the scalar field arrays are first sized for all the points; the cloud is then resized to keep only the valid points, but the scalar fields keep their original size. In that case, the scalar field should be resized with sf.resizeSafe(cloud.size(), False, 0.0)
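A minimal sketch of that workaround, assuming cloud is a ccPointCloud loaded from an E57 file with invalid points and that its first scalar field is the intensity:

sf = cloud.getScalarField(0)
if sf.currentSize() > cloud.size():
    # shrink the scalar field back to the number of valid points
    sf.resizeSafe(cloud.size(), False, 0.0)
    sf.computeMinAndMax()  # refresh min/max after the resize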

A test with some E57 files from http://www.libe57.org/data.html

import os

os.environ["_CCTRACE_"] = "ON"  # only if you want C++ debug traces

from gendata import dataExtDir  # test helper: directory holding the E57 files

import cloudComPy as cc

listFiles = ["manitou.e57",
             "manitouNoInvalidPoints.e57",
             "pumpARowColumnIndex.e57",
             "pumpARowColumnIndexNoInvalidPoints.e57",
             "pump.e57",
             "pumpNoInvalidPoints.e57"]

for f in listFiles:
    print("===================================== ", f)
    entities = cc.importFile(os.path.join(dataExtDir, f))
    for entity in entities[1]:  # entities[1] holds the clouds
        print("-------------", entity.getName(), "size", entity.size())
        sf = entity.getScalarField(0)
        print("sf.currentSize()", sf.currentSize())
        if sf.currentSize() > entity.size():
            # the scalar field still has the original size (invalid points included)
            asf = sf.toNpArray()
            print("invalid elements - asf.shape", asf.shape)
            print(asf[entity.size() - 20:entity.size() + 20])
            sf.resizeSafe(entity.size(), False, 0.0)
            sf.computeMinAndMax()
        asf = sf.toNpArray()
        print("asf.shape", asf.shape)
        print(asf[entity.size() - 20:])

dgirardeau commented 2 years ago

Hi Paul,

Which version are you based on? I believe this issue was fixed some time ago.

When 'resize' is called on a point cloud, all scalar fields should normally be resized as well: https://github.com/CloudCompare/CCCoreLib/blob/master/include/PointCloudTpl.h#L240
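If it helps, that invariant can be checked from Python with the CloudComPy calls already used in this thread (the data path is a placeholder):

import os
import cloudComPy as cc

entities = cc.importFile(os.path.join("data", "manitou.e57"))  # placeholder path
for cloud in entities[1]:
    sf = cloud.getScalarField(0)
    # after PointCloudTpl::resize, every scalar field should match the cloud size
    assert sf.currentSize() == cloud.size(), \
        "%s: sf %d != cloud %d" % (cloud.getName(), sf.currentSize(), cloud.size())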

prascle commented 2 years ago

Hi Daniel, I am working from CloudCompare master (July 14, 2022, sha1: 1118e63344e08) plus my patches. It is indeed strange: the corresponding GUI does not show any problem with the scalar fields. I will check on my side...

prascle commented 2 years ago

@dgirardeau, I checked with the CloudCompare 2.12.4 Windows binary, using the manitou.e57 file from http://www.libe57.org/data.html (which contains invalid points). When I export the Intensity scalar field histogram to a CSV, I find a lot of points (~800000) in the first class near 0, although the cloud size is about 346000. Paul
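A rough way to reproduce that count with NumPy (the path is a placeholder, and 256 bins is an arbitrary choice, not necessarily what the GUI histogram uses):

import os
import numpy as np
import cloudComPy as cc

entities = cc.importFile(os.path.join("data", "manitou.e57"))  # placeholder path
cloud = entities[1][0]
asf = cloud.getScalarField(0).toNpArray()

# with the bug present, the first class holds far more points than the cloud size
hist, edges = np.histogram(asf, bins=256)
print("cloud size:", cloud.size())
print("first class:", hist[0], "exact zeros:", np.count_nonzero(asf == 0.0))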

prascle commented 2 years ago

@Masrhalsp

I forgot the call to sf.computeMinAndMax() after sf.resizeSafe(entity.size(), False, 0.0) in my example above. Paul

dgirardeau commented 2 years ago

@prascle I found the issue... the PointCloudTpl::resize method was not always called 😅

I fixed it (hopefully): https://github.com/CloudCompare/CloudCompare/commit/a5d613fe15b7acd0ad0423ff85f3836e63898774