fwilliams / neural-splines

Official Implementation of Neural Splines
MIT License

Setting min-pts-per-cell too high leads to a ValueError #3

Open nvibd opened 3 years ago

nvibd commented 3 years ago

Here's the full command line and resulting error. I have attached the input .PLY file.

$ python fit-grid.py '/home/user/Documents/meshing_comp/input_files/einstein.ply' 10_000 128 8 --min-pts-per-cell 1000
Using random seed 2682126288
Downsampling input point cloud to voxel resolution.
Fitting 7549 points using 512 cells
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████████████▊| 511/512 [00:00<00:00, 2456.00it/s, Cell=(7, 7, 7), Num Points=0]fit-grid.py:202: DeprecationWarning: `np.bool` is a deprecated alias for the builtin `bool`. To silence this warning, use `bool` by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use `np.bool_` here.
Deprecated in NumPy 1.20; for more details and guidance: https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
  eroded_mask = binary_erosion(out_mask.numpy().astype(np.bool), np.ones([3, 3, 3]).astype(np.bool))
Traceback (most recent call last):
  File "fit-grid.py", line 224, in <module>
    main()
  File "fit-grid.py", line 204, in main
    gradient_direction='ascent')
  File "/home/user/miniconda3/envs/neural-splines/lib/python3.7/site-packages/skimage/measure/_marching_cubes_lewiner.py", line 137, in marching_cubes
    mask=mask)
  File "/home/user/miniconda3/envs/neural-splines/lib/python3.7/site-packages/skimage/measure/_marching_cubes_lewiner.py", line 302, in _marching_cubes_lewiner
    raise ValueError("Surface level must be within volume data range.")
ValueError: Surface level must be within volume data range.
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 512/512 [00:00<00:00, 1791.14it/s, Cell=(7, 7, 7), Num Points=0]

einstein.zip

fwilliams commented 3 years ago

Thanks for reporting this!

So the issue here is that we first downsample your input point cloud onto the voxel grid that we will reconstruct on, which reduces the effective size of the point cloud being reconstructed. Here you passed 128 for the longest dimension of this grid, so we average all the points within each grid cell into a single "super point" that we use for reconstruction. As the log you posted shows, this reduces your input point cloud to only 7549 points.
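For context, the downsampling step described above works roughly like this (a minimal NumPy sketch, not the actual neural-splines code; the function name and grid handling here are my assumptions):

```python
import numpy as np

def voxel_downsample(points, grid_size):
    # Hypothetical sketch: map each point to a voxel index on a grid whose
    # longest bounding-box dimension spans `grid_size` voxels, then average
    # the points sharing a voxel into one "super point".
    mins = points.min(axis=0)
    extent = (points.max(axis=0) - mins).max()
    voxel_size = extent / grid_size
    idx = np.minimum((points - mins) // voxel_size, grid_size - 1).astype(np.int64)
    # Flatten the 3D voxel index into a single sortable key
    keys = (idx[:, 0] * grid_size + idx[:, 1]) * grid_size + idx[:, 2]
    order = np.argsort(keys)
    splits = np.flatnonzero(np.diff(keys[order])) + 1
    groups = np.split(points[order], splits)
    return np.stack([g.mean(axis=0) for g in groups])
```

The output can have at most `grid_size**3` points, which is why a coarse grid can collapse a large input cloud to the 7549 points seen in the log.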

As a fix, you can try reconstructing on a larger grid (e.g. size 512).

I do think this behavior is confusing, so I'll leave the issue open until we filter based on the size of the input point cloud rather than the downsampled one.
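For anyone else hitting this: the downstream failure comes from scikit-image's iso-level validation in marching_cubes. When min-pts-per-cell filters out every cell (note Num Points=0 in the progress bar), the reconstructed volume is effectively constant, so the surface level falls outside its value range. A tiny sketch of that check (my reading of the error, not scikit-image's exact code):

```python
import numpy as np

def surface_level_in_range(volume, level):
    # scikit-image's marching_cubes raises
    # "Surface level must be within volume data range."
    # when the iso-level lies outside [volume.min(), volume.max()].
    return volume.min() <= level <= volume.max()

# If every cell is filtered out, the volume is constant, so a nonzero
# iso-level cannot be bracketed anywhere:
constant_volume = np.zeros((8, 8, 8))
surface_level_in_range(constant_volume, 0.5)  # False
```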