mauriceqch / pcc_geo_cnn_v2

Improved Deep Point Cloud Geometry Compression
MIT License

Compression IndexError out of bounds for axis 2 with size 64 #13

Open bishoyroufael opened 2 years ago

bishoyroufael commented 2 years ago

I got the following error when trying to compress a PLY file:

2022-01-19 02:03:19.832 INFO pc_io - load_points: Loading PCs into memory (parallel reading)
100% 1/1 [01:54<00:00, 114.41s/it]
2022-01-19 02:05:14.584 INFO compress_octree - compress: Performing octree partitioning
2022-01-19 02:06:51.474 INFO compress_octree - compress: Processing resolution 1024 with octree level 4 resulting in dense_tensor_shape [ 1 64 64 64] and 8 blocks
2022-01-19 02:06:52.203 INFO utils - _init_num_threads: NumExpr defaulting to 2 threads.
2022-01-19 02:06:53.138 WARNING deprecation - new_func: From /usr/local/lib/python3.7/dist-packages/tensorflow_core/python/ops/resource_variable_ops.py:1630: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
2022-01-19 02:06:55.970 INFO compress_octree - compress: Init session
2022-01-19 02:06:58.670 INFO saver - restore: Restoring parameters from /content/pcc_geo_cnn_v2/pcc_geo_cnn_v2/models/c4-ws/1.00e-04/model.ckpt-26000
  0% 0/1 [00:00<?, ?it/s]2022-01-19 02:06:58.810 INFO compress_octree - compress: Starting /content/faro_converted_ascii.ply to /content/faro_converted_ascii.comp.bin with 8 blocks
2022-01-19 02:06:58.811 INFO model_types - compress_blocks: Compress block 0/8: start
2022-01-19 02:06:58.854 INFO model_types - compress_blocks: Compress block: run session
2022-01-19 02:07:02.296 INFO model_types - compress_blocks: Compress block: session done
2022-01-19 02:07:02.297 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2022-01-19 02:10:09.271 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 2135699 points
191/256 thresholds eligible for max_delta inf, d1_mse 133 49/2135699 points (ratio 0.00) 3.32e-01 < mean point metric 2.70e+00
2022-01-19 02:10:09.272 INFO model_types - compress_blocks: Compress block: done
2022-01-19 02:10:09.273 INFO model_types - compress_blocks: Compress block 1/8: start
Traceback (most recent call last):
  File "compress_octree.py", line 185, in <module>
    compress()
  File "compress_octree.py", line 106, in compress
    fixed_threshold=args.fixed_threshold, debug=args.debug)
  File "/content/pcc_geo_cnn_v2/src/model_types.py", line 195, in compress_blocks
    x_val = sparse_to_dense(block_uint32, self.x.shape, self.data_format)
  File "/content/pcc_geo_cnn_v2/src/model_types.py", line 111, in sparse_to_dense
    x_val[0, 0, block[:, 0], block[:, 1], block[:, 2]] = 1.0
IndexError: index 2147483648 is out of bounds for axis 2 with size 64
  0% 0/1 [03:10<?, ?it/s]

Any ideas what could be the cause?

mauriceqch commented 2 years ago

What point cloud are you using? There seems to be something wrong with the data. The point cloud only has 8 blocks, which is low for 10 bits. The first block has 2135699 points, which should be impossible for a block size of 64 (the maximum is 64×64×64 = 262144).

This line, IndexError: index 2147483648 is out of bounds for axis 2 with size 64, may indicate an overflow/underflow problem, as 2147483648 equals 2^31, which is one more than the largest signed 32-bit integer.
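The wraparound can be reproduced in a few lines of NumPy: a negative 32-bit coordinate of -2^31 cast to an unsigned dtype (the traceback shows the block array is named `block_uint32`) becomes exactly 2147483648. This is an illustrative sketch of the failure mode, not the exact code path in the repository:

```python
import numpy as np

# A signed 32-bit coordinate of -2**31. On x86, an out-of-range
# float-to-int conversion commonly produces exactly this value (INT32_MIN).
coords = np.array([[-2**31, 3, 5]], dtype=np.int32)

# Signed-to-unsigned casting is modular: -2**31 wraps to 2**31 = 2147483648,
# the exact index reported in the traceback.
block_uint32 = coords.astype(np.uint32)
print(block_uint32[0, 0])  # 2147483648

# Indexing a 64^3 dense block with that coordinate raises the IndexError.
dense = np.zeros((1, 1, 64, 64, 64), dtype=np.float32)
try:
    dense[0, 0, block_uint32[:, 0], block_uint32[:, 1], block_uint32[:, 2]] = 1.0
except IndexError as e:
    print(e)
```

So any point with a negative or out-of-range coordinate entering `sparse_to_dense` will trigger this error.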

NiranjanRavi05 commented 1 year ago

Hi,

I ran it on Stanford_Bunny.ply and chair_0894.ply.

I am getting the same error:

2023-03-07 23:05:38.073 INFO pc_io - load_points: Loading PCs into memory (parallel reading)
100%|██████████| 1/1 [00:00<00:00, 1.29it/s]
2023-03-07 23:05:39.044 INFO compress_octree - compress: Performing octree partitioning
2023-03-07 23:05:39.187 INFO compress_octree - compress: Processing resolution 32 with octree level 1 resulting in dense_tensor_shape [ 1 16 16 16] and 282 blocks
2023-03-07 23:05:40.291 WARNING deprecation - new_func: From /mnt/tank/bem-nr/pcc_geo/lib/python3.6/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1630: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
2023-03-07 23:05:41.575 INFO compress_octree - compress: Init session
2023-03-07 23:05:42.323 INFO saver - restore: Restoring parameters from /mnt/tank/bem-nr/pcc_geo_cnn_v2/pcc_geo_cnn_v2/models/c3p-a0.5/1.00e-04/model.ckpt-23500
  0%| | 0/1 [00:00<?, ?it/s]2023-03-07 23:05:42.427 INFO compress_octree - compress: Starting /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/Stanford_Bunny.ply to /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/experiments/pcc_geo_cnn_v2/Stanford_Bunny/c4-ws/1.00e-04/Stanford_Bunny_d1.ply.bin, /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/experiments/pcc_geo_cnn_v2/Stanford_Bunny/c4-ws/1.00e-04/Stanford_Bunny_d2.ply.bin with 282 blocks
2023-03-07 23:05:42.429 INFO model_types - compress_blocks: Compress block 0/282: start
2023-03-07 23:05:42.429 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:05:44.501 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:05:44.503 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:05:46.145 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 155 points
. . . . .
177/256 thresholds eligible for max_delta inf, d1_mse 95 105/69 points (ratio 1.52) 1.04e+00 < mean point metric 2.31e+01, d2_mse 113 84/69 points (ratio 1.22) 1.21e+00 < mean point metric 7.50e+00
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block 31/282: start
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:06:02.588 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:06:02.588 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:06:03.386 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 160 points
159/256 thresholds eligible for max_delta inf, d1_mse 110 195/160 points (ratio 1.22) 1.30e+00 < mean point metric 4.19e+01, d2_mse 140 24/160 points (ratio 0.15) 1.96e+00 < mean point metric 6.87e+00
2023-03-07 23:06:03.387 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:03.388 INFO model_types - compress_blocks: Compress block 32/282: start
2023-03-07 23:06:03.388 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:06:03.403 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:06:03.403 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:06:03.830 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 13 points
124/256 thresholds eligible for max_delta inf, d1_mse 91 19/13 points (ratio 1.46) 1.19e+00 < mean point metric 4.92e+00, d2_mse 255 0/13, metric 1.47e+00 > mean point metric 8.23e-01
2023-03-07 23:06:03.830 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:03.831 INFO model_types - compress_blocks: Compress block 33/282: start
Traceback (most recent call last):
  File "compress_octree.py", line 185, in <module>
    compress()
  File "compress_octree.py", line 106, in compress
    fixed_threshold=args.fixed_threshold, debug=args.debug)
  File "/mnt/tank/bem-nr/pcc_geo_cnn_v2/src/model_types.py", line 195, in compress_blocks
    x_val = sparse_to_dense(block_uint32, self.x.shape, self.data_format)
  File "/mnt/tank/bem-nr/pcc_geo_cnn_v2/src/model_types.py", line 111, in sparse_to_dense
    x_val[0, 0, block[:, 0], block[:, 1], block[:, 2]] = 1.0
IndexError: index 2147483648 is out of bounds for axis 2 with size 16

can you please let me know why this is happening?

ArmandZampieri commented 5 months ago

This seems to be due to the point cloud quantization: the provided input samples appear to be quantized to non-negative integers. I tried with floating-point positions (positive and negative) and got the same issue; re-quantizing the coordinates to integers solved it, though this requirement is not clearly documented.
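A minimal sketch of such a re-quantization step, assuming the points are available as a NumPy array of float positions; the function name, the uniform min-shift/scale scheme, and the 10-bit default resolution are illustrative choices, not code from this repository:

```python
import numpy as np

def quantize_points(points, resolution=1024):
    """Map float positions (possibly negative) to non-negative integer
    voxel coordinates in [0, resolution - 1], removing duplicate voxels."""
    points = np.asarray(points, dtype=np.float64)
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    # Uniform scale so the whole bounding box fits the target grid.
    scale = (resolution - 1) / (maxs - mins).max()
    quantized = np.round((points - mins) * scale).astype(np.uint32)
    return np.unique(quantized, axis=0)

# Example: floating-point positions with negative coordinates.
pts = np.array([[-0.5, 0.0, 0.25], [0.5, 1.0, -0.25]])
q = quantize_points(pts, resolution=1024)
assert q.min() >= 0 and q.max() <= 1023
```

After writing the quantized coordinates back to a PLY file, all indices stay within the dense tensor shape and the cast to uint32 can no longer wrap around.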