mauriceqch / pcc_geo_cnn

Learning Convolutional Transforms for Point Cloud Geometry Compression
MIT License

`./mesh_to_pc.py ../data/ModelNet40 ../data/ModelNet40_pc_64 --vg_size 64` gives error #3

Open · pedrogarciafreitas opened this issue 5 years ago

pedrogarciafreitas commented 5 years ago

I've downloaded ModelNet40 and extracted it to `../data/ModelNet40`.

After executing `./mesh_to_pc.py ../data/ModelNet40 ../data/ModelNet40_pc_64 --vg_size 64`, I got the following error:

File "/opt/conda/Anaconda3/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "./mesh_to_pc.py", line 39, in process
    pc = pc_mesh.get_sample("mesh_random", n=args.n_samples, as_PyntCloud=True)
  File "/opt/conda/Anaconda3/lib/python3.7/site-packages/pyntcloud-0.1.2-py3.7.egg/pyntcloud/core_class.py", line 534, in get_sample
    sample = sampler.compute()
  File "/opt/conda/Anaconda3/lib/python3.7/site-packages/pyntcloud-0.1.2-py3.7.egg/pyntcloud/samplers/mesh.py", line 67, in compute
    np.arange(len(areas)), size=self.n, p=probabilities)
  File "mtrand.pyx", line 793, in numpy.random.mtrand.RandomState.choice
ValueError: probabilities contain NaN```

The full error message:

```
./mesh_to_pc.py ../data/ModelNet40 ../data/ModelNet40_pc_64 --vg_size 64
2019-10-25 08:46:30.595 INFO mesh_to_pc - : Found 12311 models in ../data/ModelNet40
  0%|          | 0/12311 [00:00<?, ?it/s]
/opt/conda/Anaconda3/lib/python3.7/site-packages/numpy/linalg/linalg.py:2512: RuntimeWarning: overflow encountered in multiply
  s = (x.conj() * x).real
/opt/conda/Anaconda3/lib/python3.7/site-packages/pyntcloud-0.1.2-py3.7.egg/pyntcloud/samplers/mesh.py:65: RuntimeWarning: invalid value encountered in true_divide
  probabilities = areas / np.sum(areas)
  0%|▎         | 28/12311 [00:06<2:47:19, 1.22it/s]
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
  File "/opt/conda/Anaconda3/lib/python3.7/multiprocessing/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "./mesh_to_pc.py", line 39, in process
    pc = pc_mesh.get_sample("mesh_random", n=args.n_samples, as_PyntCloud=True)
  File "/opt/conda/Anaconda3/lib/python3.7/site-packages/pyntcloud-0.1.2-py3.7.egg/pyntcloud/core_class.py", line 534, in get_sample
    sample = sampler.compute()
  File "/opt/conda/Anaconda3/lib/python3.7/site-packages/pyntcloud-0.1.2-py3.7.egg/pyntcloud/samplers/mesh.py", line 67, in compute
    np.arange(len(areas)), size=self.n, p=probabilities)
  File "mtrand.pyx", line 793, in numpy.random.mtrand.RandomState.choice
ValueError: probabilities contain NaN
"""
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "./mesh_to_pc.py", line 86, in <module>
    list(tqdm(p.imap(process_f, files), total=files_len))
  File "/opt/conda/Anaconda3/lib/python3.7/site-packages/tqdm/std.py", line 1081, in __iter__
    for obj in iterable:
  File "/opt/conda/Anaconda3/lib/python3.7/multiprocessing/pool.py", line 748, in next
    raise value
ValueError: probabilities contain NaN
  0%|▎         | 28/12311 [00:06<48:44, 4.20it/s]
```

mauriceqch commented 5 years ago

Hi Pedro,

It seems that there is an overflow when computing areas to obtain the sampling probabilities.

To help with your issue, could you give me more information about your configuration (OS, package versions) and where you downloaded the dataset? It would also help if you could rerun the script with parallel processing disabled (by removing the call to `pool.imap`). That way, you can run the script with `pdb` and pinpoint the exact point cloud causing the issue.
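For reference, a minimal sketch of that change, based on the line shown in the traceback above (the rest of mesh_to_pc.py is assumed unchanged):

```python
# Parallel version from the traceback: worker processes obscure which file fails.
# list(tqdm(p.imap(process_f, files), total=files_len))

# Sequential version: a plain map runs in the main process, so pdb and the
# traceback point directly at the offending mesh.
list(tqdm(map(process_f, files), total=files_len))
```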

xtorker commented 4 years ago

- `cone/train/cone_0117.off`
- `curtain/train/curtain_0066.off`

These two meshes cause the issue: their vertex coordinates are too large, causing an overflow during sampling.
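A minimal sketch of the failure mode with invented values (not taken from those meshes): squaring large float32 coordinates during the area computation overflows, and the resulting `inf` values later turn into NaN probabilities.

```python
import numpy as np

# Invented oversized coordinates stored as float32 (max ~3.4e38).
v = np.array([1e20, 1e20, 1e20], dtype=np.float32)

print(v * v)              # [inf inf inf] -- overflow encountered in multiply
print(np.linalg.norm(v))  # inf -- so the triangle "area" becomes inf

# inf / inf yields NaN, which is what np.random.choice then rejects.
print(np.float32(np.inf) / np.float32(np.inf))  # nan
```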

mauriceqch commented 4 years ago

This is surprising, as I did not have this issue in my environment. One possible cause could be the version of numpy.

Could you provide the output of the following?

```python
import numpy as np
a = np.zeros((2, 2))
print(a.dtype)
```

It should output `float64` on recent numpy versions.

A possible solution would be to patch the pyntcloud sampler (`pyntcloud/samplers/mesh.py` in the traceback above) to avoid this overflow issue. Replace

```python
probabilities = areas / np.sum(areas)
```

by

```python
max_area = np.max(areas)
areas = areas / max_area
probabilities = areas / np.sum(areas)
```

`(areas / max_area) / np.sum(areas / max_area)` should avoid the overflow issue and be equal to `areas / np.sum(areas)`.
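A toy illustration of the rescaling, again with invented float32 values whose sum would overflow:

```python
import numpy as np

# Invented areas that individually fit in float32 but whose sum overflows.
areas = np.array([3e38, 2e38, 1e38], dtype=np.float32)
print(np.sum(areas))  # inf -> NaN probabilities after the division

# Dividing by the max first keeps every intermediate value small.
scaled = areas / np.max(areas)
probabilities = scaled / np.sum(scaled)
print(probabilities)  # approximately [0.5 0.33333334 0.16666667]
```

Note that this only helps if the overflow happens in the sum; if an area is already `inf` from the norm computation, `inf / inf` still yields NaN.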

xtorker commented 4 years ago

The output is `float64`, as expected.

The replacement didn't work; maybe the problem is in `numpy.linalg`.

BTW, what's your numpy version? I have seen this problem in some environments but not others, though I don't remember which numpy versions I used.

mauriceqch commented 4 years ago

I am using numpy 1.17.4 and I have reproduced the issue. I issued a fix in commit e0862279ae0bf5fce9190403f9419a7e3b58910a.

Essentially, you can add the following lines right after `pc_mesh = PyntCloud.from_file(ori_path)` to fix the overflow issue:

```python
mesh = pc_mesh.mesh
pc_mesh.points = pc_mesh.points.astype('float64', copy=False)
pc_mesh.mesh = mesh
```

This enforces the use of float64 in PyntCloud to avoid the overflow. We first back up the mesh and re-set it afterwards because the points setter removes the mesh.
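Put together, a hedged sketch of how the patch slots into the processing code (the path and sample count are illustrative placeholders):

```python
from pyntcloud import PyntCloud

ori_path = "cone/train/cone_0117.off"  # one of the meshes reported above
n_samples = 500000                     # placeholder sample count

pc_mesh = PyntCloud.from_file(ori_path)
# Back up the mesh first: assigning to pc_mesh.points resets the mesh.
mesh = pc_mesh.mesh
pc_mesh.points = pc_mesh.points.astype('float64', copy=False)
pc_mesh.mesh = mesh
# Areas and probabilities are now computed in float64, so sampling succeeds.
pc = pc_mesh.get_sample("mesh_random", n=n_samples, as_PyntCloud=True)
```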

pedrogarciafreitas commented 4 years ago

Hi guys,

```python
import numpy as np
a = np.zeros((2, 2))
print(a.dtype)
```

The output is `float64`, as expected.

`np.__version__` is 1.18.1.

I got the same error.

mauriceqch commented 4 years ago

Hi @pedrogarciafreitas,

Have you tried the patched version of the repository?

xtorker commented 4 years ago

> I am using numpy 1.17.4 and I have reproduced the issue. I issued a fix in commit e086227.
>
> Essentially, you can add the following lines right after `pc_mesh = PyntCloud.from_file(ori_path)` to fix the overflow issue:
>
> ```python
> mesh = pc_mesh.mesh
> pc_mesh.points = pc_mesh.points.astype('float64', copy=False)
> pc_mesh.mesh = mesh
> ```
>
> This enforces the use of float64 in PyntCloud to avoid the overflow. We first back up the mesh and re-set it afterwards because the points setter removes the mesh.

This works. Thanks for your reply.

mauriceqch commented 4 years ago

Thanks for letting me know!