Closed · ichlubna closed this 3 weeks ago
Hi @ichlubna,
Seeing the compression/decompression time, I would guess that nothing is actually being compressed or decompressed there. So, a couple of questions:
1. Are the weights placed in the models/ folder that you pointed to, and does the model have the same name as the experiment (i.e. is os.path.join(args.model_path, args.experiment) correctly pointing to the weights)?
2. Is pc_blocks, defined here https://github.com/mmspg/pcc-geo-slicing/blob/131d573e9b68b2896f74c1ff48913c4dba2492fb/point_cloud_compression_slice_conditioning.py#L631C9-L631C72, non-empty at runtime?

Thank you for your answer @WonderPG.
I have downloaded two weights from the FTP server and placed them into the models directory. This is my directory content:
├── models
│   ├── a0.6_res_t64_Slice_cond_160-10_40
│   │   ├── assets
│   │   ├── keras_metadata.pb
│   │   ├── saved_model.pb
│   │   └── variables
│   │       ├── variables.data-00000-of-00001
│   │       └── variables.index
│   └── a0.6_res_t64_Slice_cond_160-10_400
│       ├── keras_metadata.pb
│       ├── saved_model.pb
│       └── variables
│           ├── variables.data-00000-of-00001
│           └── variables.index
├── point_cloud_compression_slice_conditioning.py
├── README.md
├── requirements.txt
├── src
│   ├── compression_utilities.py
│   ├── evaluate.py
│   ├── focal_loss.py
│   ├── __init__.py
│   ├── partition.py
│   ├── pc_io.py
│   ├── processing.py
│   └── __pycache__
│       ├── compression_utilities.cpython-311.pyc
│       ├── evaluate.cpython-311.pyc
│       ├── focal_loss.cpython-311.pyc
│       ├── __init__.cpython-311.pyc
│       ├── partition.cpython-311.pyc
│       ├── pc_io.cpython-311.pyc
│       └── processing.cpython-311.pyc
├── test
│   └── input.ply
└── testOut
    └── a0.6_res_t64_Slice_cond_160-10_40
        └── input.bin
Running command:
python point_cloud_compression_slice_conditioning.py --experiment a0.6_res_t64_Slice_cond_160-10_40 --model_path models/ compress --adaptive --resolution 64 --input_glob 'test/*.ply' --output_dir testOut
os.path.join(args.model_path, args.experiment) seems to be OK:
models/a0.6_res_t64_Slice_cond_160-10_40
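For reference, a quick sanity check of the path resolution (using the argument values from the command above) can be reconstructed as follows; this is just an illustration, not code from the repository:

```python
import os

# Re-create the path resolution from the compression script's arguments:
# os.path.join collapses the trailing slash, so the resolved directory is
# exactly the experiment folder that should contain saved_model.pb.
model_path = "models/"
experiment = "a0.6_res_t64_Slice_cond_160-10_40"
resolved = os.path.join(model_path, experiment)
print(resolved)  # models/a0.6_res_t64_Slice_cond_160-10_40
```

Checking os.path.isdir(resolved) and the presence of saved_model.pb inside it confirms the weights are where the script expects them.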
Oh, you are right about pc_blocks, it is empty! Added:
print(pc_blocks)
print(len(pc_blocks))
And got:
[]
0
Great, though now I am not sure exactly why you get no blocks here. My initial guess would be that in partition_pc, the variable steps defined here https://github.com/mmspg/pcc-geo-slicing/blob/131d573e9b68b2896f74c1ff48913c4dba2492fb/src/partition.py#L23 is 0, meaning that the resolution of your point cloud is smaller than the block size. In other words, your point cloud is too small for the resolution you specify.
If that is the case, I suggest trying to compress with a lower resolution to see if it works.
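A minimal sketch of the failure mode described here (this is a hypothetical illustration, not the repository's actual partitioning code): if the number of block steps per axis is derived from the point cloud extent divided by the block size, a cloud smaller than one block yields zero steps and therefore an empty pc_blocks list.

```python
import numpy as np

# Hypothetical illustration: steps per axis as floor(extent / block_size).
# A point cloud whose extent is smaller than one block gives steps == 0,
# so no blocks are produced and nothing gets compressed.
def steps_per_axis(points, block_size):
    extent = points.max(axis=0) - points.min(axis=0)
    return np.floor(extent / block_size).astype(int)

tiny = np.array([[0.0, 0.0, 0.0], [31.0, 31.0, 31.0]])
print(steps_per_axis(tiny, 64))    # [0 0 0] -> no blocks

scaled = np.array([[0.0, 0.0, 0.0], [256.0, 256.0, 256.0]])
print(steps_per_axis(scaled, 64))  # [4 4 4] -> blocks are produced
```

Scaling the model up (or lowering the resolution) makes the extent span at least one block, which matches the fix reported below.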
You are right! I have scaled up the model and the compression seems to work. The steps are 4 and the compression time is higher:
python point_cloud_compression_slice_conditioning.py --experiment a0.6_res_t64_Slice_cond_160-10_40 --model_path models/ compress --adaptive --resolution 64 --input_glob 'test/*.ply' --output_dir testOut
Compressed 9/9 blocks of input.ply: 100%|███████████████████████████████████████████████████████████████████████████████| 1/1 [00:04<00:00, 4.89s/it]
Done. Total compression time: 4.893083333969116s
I also had to add a color attribute to the input point cloud to make it work.
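One hypothetical way to attach a constant color attribute to a plain geometry-only point cloud is to write an ASCII PLY with red/green/blue properties by hand (the helper name and the gray color here are made up for illustration; Blender or CloudCompare can do the same):

```python
# Hypothetical helper: write an ASCII PLY with a constant color attribute
# per vertex, for compressors that reject point clouds without color.
def write_colored_ply(path, points, rgb=(128, 128, 128)):
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z} {rgb[0]} {rgb[1]} {rgb[2]}\n")

write_colored_ply("input_colored.ply", [(0.0, 0.0, 0.0), (1.0, 2.0, 3.0)])
```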
Decompression:
python point_cloud_compression_slice_conditioning.py --experiment a0.6_res_t64_Slice_cond_160-10_40 --model_path models/ decompress --input_dir testOut/ --output_dir testDec/
Decompressed 9/9 blocks of input.bin: 100%|█████████████████████████████████████████████████████████████████████████████| 1/1 [00:02<00:00, 2.41s/it]
Done. Total decompression time: 2.405393123626709s
But the decompressed file seems to be empty. It looks like the resolution problem is solved, but this might be something else?
I am not exactly sure what I am looking at there, but if the file seems empty, check whether the variable block_pa has a non-trivial occupancy map after this line https://github.com/mmspg/pcc-geo-slicing/blob/131d573e9b68b2896f74c1ff48913c4dba2492fb/point_cloud_compression_slice_conditioning.py#L728. Verify it for every block. This is the occupancy map that is saved in the file and should correspond to the decompressed point cloud.
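The kind of check meant here can be sketched as follows, assuming the decoder turns a volume of predicted occupancy probabilities into points by thresholding (this mirrors, but is not, the repository code):

```python
import numpy as np

# Hypothetical sketch: voxels whose predicted occupancy probability
# exceeds the threshold become points in the block's point array.
# If no voxel clears the threshold, the array is empty.
def occupancy_to_points(x_hat, threshold):
    return np.argwhere(x_hat.squeeze(-1) > threshold)

x_hat = np.zeros((4, 4, 4, 1))
x_hat[1, 2, 3, 0] = 0.9

print(len(occupancy_to_points(x_hat, 0.5)))  # 1: one voxel is occupied
print(len(occupancy_to_points(x_hat, 1.0)))  # 0: nothing exceeds 1.0
```

This is consistent with the observation further down: with a threshold of 1.0 and every predicted probability below it, every block decodes to an empty point array.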
block_pa is empty. Sorry for the trouble. Added:
print(block_pa)
block_pa += np.asarray(block_position) * resolution
print(block_pa)
Got:
[] []
Decompressed 1/9 blocks of input.bin: 0%| | 0/1 [00:00<?, ?it/s]
[] []
Decompressed 2/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 3/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 4/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 5/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 6/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 7/9 blocks of input.bin: 0%| | 0/1 [00:01<?, ?it/s]
[] []
Decompressed 8/9 blocks of input.bin: 0%| | 0/1 [00:02<?, ?it/s]
[] []
Decompressed 9/9 blocks of input.bin: 0%| | 0/1 [00:02<?, ?it/s]
My point cloud is made in Blender:
Can you check the value of x_hat here https://github.com/mmspg/pcc-geo-slicing/blob/131d573e9b68b2896f74c1ff48913c4dba2492fb/point_cloud_compression_slice_conditioning.py#L724 and the associated threshold?
Printing:
print(x_hat)
print(threshold)
returns:
[[[[0.13370346]
[0.0837783 ]
[0.07062638]
...
[0.07797834]
[0.08800992]
[0.10989405]]
[[0.08309832]
[0.05527447]
[0.05518826]
...
[0.05013118]
[0.05643999]
[0.08818358]]
[[0.07377354]
[0.04969506]
[0.04149766]
...
[0.04270149]
[0.04622087]
[0.0713947 ]]
...... similar values
[[0.08954909]
[0.05701619]
[0.03480629]
...
[0.03185652]
[0.05070778]
[0.07007699]]
[[0.11829787]
[0.07142194]
[0.05304877]
...
[0.04579277]
[0.06470928]
[0.0917687 ]]]]
1.0
I suggest you retry the pipeline without the --adaptive argument and see if that works. If it does, could you try again with --adaptive but print every threshold, i.e. output the threshold for every iteration of the for loop?
Thank you so much for your patience and help. I finally found out where the problem was: the model apparently has to be placed in the positive subspace, since negative coordinates cause problems. I tried with the other set of weights and got a few vertices, which is how I detected the problem after all. Everything works now! Thank you again!
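The workaround reported here can be sketched minimally (a hypothetical helper, not part of the repository): translate the point cloud into the positive octant before compression and keep the offset so the shift can be undone afterwards.

```python
import numpy as np

# Hypothetical workaround helper: shift all coordinates so they are
# non-negative; the returned offset restores the original position
# after decompression.
def to_positive_octant(points):
    offset = points.min(axis=0)
    return points - offset, offset

pts = np.array([[-5.0, 2.0, -1.0], [3.0, -4.0, 7.0]])
shifted, offset = to_positive_octant(pts)
print(shifted.min())     # 0.0: no negative coordinates remain
print(shifted + offset)  # adding the offset back restores the input
```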
My pleasure, and it is good to know in case anyone else has the same issue.
Good luck with your research!
Hello, I apparently cannot use your framework for compression and decompression. I would like to cite your work in my research, so would you please help me with the correct usage? This is how I use it:
I get the result message that looks OK, there are some warnings though:
Let me know if you need all the warnings; I omitted some for brevity. The resulting decompressed file is not valid, and the compressed one is always 2 bytes large, no matter what data I use as the input.
Attaching the files: files.zip