bishoyroufael opened this issue 2 years ago
What point cloud are you using? There seems to be something wrong with the data. The point cloud only has 8 blocks, which is low for 10 bits, and the first block has 2135699 points, which should be impossible for a block size of 64 (the maximum is 64×64×64 = 262144).
This line
IndexError: index 2147483648 is out of bounds for axis 2 with size 64
may indicate an overflow/underflow problem: the index 2147483648 equals 2^31, one past the largest signed 32-bit integer.
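The out-of-bounds index can be reproduced directly with NumPy. This is only an illustrative sketch, not the repository's code: it shows that indexing a 64-voxel axis with 2^31 yields the exact error reported, and that 2^31 is one past `int32` max, which is what suggests a 32-bit overflow somewhere in the index computation.

```python
import numpy as np

# Illustrative sketch: a 64x64x64 occupancy block, as in the log output.
block = np.zeros((64, 64, 64), dtype=np.uint8)

# 2**31 is exactly one past the largest signed 32-bit integer.
assert 2**31 == np.iinfo(np.int32).max + 1

try:
    _ = block[0, 0, 2**31]
except IndexError as e:
    print(e)  # index 2147483648 is out of bounds for axis 2 with size 64
```

Any intermediate that wraps or mis-casts a coordinate into the 32-bit range can produce such an index; unquantized (float or negative) input coordinates are a plausible trigger, as discussed below in this thread.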
Hi,
I ran it for Stanford_Bunny.ply and chair_0894.ply
I am getting the same error:
2023-03-07 23:05:38.073 INFO pc_io - load_points: Loading PCs into memory (parallel reading)
100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 1/1 [00:00<00:00, 1.29it/s]
2023-03-07 23:05:39.044 INFO compress_octree - compress: Performing octree partitioning
2023-03-07 23:05:39.187 INFO compress_octree - compress: Processing resolution 32 with octree level 1 resulting in dense_tensor_shape [ 1 16 16 16] and 282 blocks
2023-03-07 23:05:40.291 WARNING deprecation - new_func: From /mnt/tank/bem-nr/pcc_geo/lib/python3.6/site-packages/tensorflow_core/python/ops/resource_variable_ops.py:1630: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
2023-03-07 23:05:41.575 INFO compress_octree - compress: Init session
2023-03-07 23:05:42.323 INFO saver - restore: Restoring parameters from /mnt/tank/bem-nr/pcc_geo_cnn_v2/pcc_geo_cnn_v2/models/c3p-a0.5/1.00e-04/model.ckpt-23500
0%| | 0/1 [00:00<?, ?it/s]2023-03-07 23:05:42.427 INFO compress_octree - compress: Starting /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/Stanford_Bunny.ply to /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/experiments/pcc_geo_cnn_v2/Stanford_Bunny/c4-ws/1.00e-04/Stanford_Bunny_d1.ply.bin, /mnt/tank/bem-nr/datasets/pointclouds/fraunhofer/experiments/pcc_geo_cnn_v2/Stanford_Bunny/c4-ws/1.00e-04/Stanford_Bunny_d2.ply.bin with 282 blocks
2023-03-07 23:05:42.429 INFO model_types - compress_blocks: Compress block 0/282: start
2023-03-07 23:05:42.429 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:05:44.501 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:05:44.503 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:05:46.145 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 155 points
...
177/256 thresholds eligible for max_delta inf, d1_mse 95 105/69 points (ratio 1.52) 1.04e+00 < mean point metric 2.31e+01, d2_mse 113 84/69 points (ratio 1.22) 1.21e+00 < mean point metric 7.50e+00
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block 31/282: start
2023-03-07 23:06:02.570 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:06:02.588 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:06:02.588 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:06:03.386 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 160 points
159/256 thresholds eligible for max_delta inf, d1_mse 110 195/160 points (ratio 1.22) 1.30e+00 < mean point metric 4.19e+01, d2_mse 140 24/160 points (ratio 0.15) 1.96e+00 < mean point metric 6.87e+00
2023-03-07 23:06:03.387 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:03.388 INFO model_types - compress_blocks: Compress block 32/282: start
2023-03-07 23:06:03.388 INFO model_types - compress_blocks: Compress block: run session
2023-03-07 23:06:03.403 INFO model_types - compress_blocks: Compress block: session done
2023-03-07 23:06:03.403 INFO model_types - compress_blocks: Compress block: compute optimal thresholds
2023-03-07 23:06:03.830 INFO model_opt - compute_optimal_thresholds: Processing max_deltas [inf] on block with 13 points
124/256 thresholds eligible for max_delta inf, d1_mse 91 19/13 points (ratio 1.46) 1.19e+00 < mean point metric 4.92e+00, d2_mse 255 0/13, metric 1.47e+00 > mean point metric 8.23e-01
2023-03-07 23:06:03.830 INFO model_types - compress_blocks: Compress block: done
2023-03-07 23:06:03.831 INFO model_types - compress_blocks: Compress block 33/282: start
Traceback (most recent call last):
File "compress_octree.py", line 185, in
Can you please let me know why this is happening?
This seems to be due to the point cloud quantization: the provided input sample appears to be quantized as positive integers only. I tried with floating-point positions (positive and negative) and got the same issue; re-quantizing to integers seems to solve it (though this is not clearly documented).
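The re-quantization step can be sketched as follows. This is a minimal illustration, not the project's preprocessing; the function name `quantize_points` and the default `resolution` are my own, and the project may normalize differently (e.g. a fixed bounding box rather than the data extent).

```python
import numpy as np

def quantize_points(points, resolution=1024):
    """Rescale float coordinates to non-negative integers in [0, resolution).

    Hypothetical helper: maps the point cloud's bounding box onto the
    voxel grid, rounds to integer coordinates, and drops duplicates so
    each occupied voxel appears once.
    """
    points = np.asarray(points, dtype=np.float64)
    mins = points.min(axis=0)
    ranges = points.max(axis=0) - mins
    # Uniform scale so the largest extent fits in [0, resolution - 1];
    # guard against a degenerate (zero-extent) cloud.
    scale = (resolution - 1) / max(ranges.max(), 1e-12)
    quantized = np.round((points - mins) * scale).astype(np.int64)
    return np.unique(quantized, axis=0)
```

With coordinates guaranteed non-negative and bounded by the resolution, the per-block voxel indices stay within the block size, which avoids the out-of-bounds index seen above.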
I got the following error when trying to compress a .ply file.
Any ideas what could be the cause?