yunhe20 / D-PCC

This is the official PyTorch implementation of our paper "Density-preserving Deep Point Cloud Compression" (CVPR 2022).

Question about prepare_semantickitti.py #3

Closed baibaibai-star closed 1 year ago

baibaibai-star commented 2 years ago

Hello! Thanks for your great work! I have tried to run prepare_semantickitti.py to generate the KITTI train and val datasets, but the process is always killed during pickle.dump. I guess it is running out of memory. May I ask whether you have encountered this situation, and how big your final semantickitti_train_cube_size_12.pkl file is? Thank you very much!

juc023 commented 2 years ago

I also experienced the same problem!

baibaibai-star commented 2 years ago

> I also experienced the same problem!

Have you managed to run prepare_shapenet.py to process the ShapeNet dataset successfully? I also ran into errors in that step, such as 'Not a JPEG file: starts with 0x89 0x50'.

juc023 commented 2 years ago

My workaround takes some time. For SemanticKITTI, I resolved the issue by writing separate pickles and merging them later. For ShapeNet, I skipped the mesh files that raise such an error.
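
For later readers, here is a minimal sketch of the separate-pickles idea. It assumes a per-sequence processing hook; `process_fn` and `CHUNK_DIR` are hypothetical, and the actual loop in prepare_semantickitti.py differs:

```python
import pickle
from pathlib import Path

CHUNK_DIR = Path('./semantickitti_chunks')  # hypothetical scratch directory

def dump_in_chunks(sequences, process_fn):
    """Write one small pickle per sequence instead of one huge dump.

    process_fn stands in for whatever prepare_semantickitti.py does to
    turn a sequence into its list of cubes (hypothetical hook).
    """
    CHUNK_DIR.mkdir(parents=True, exist_ok=True)
    for seq in sequences:
        cubes = process_fn(seq)
        with open(CHUNK_DIR / f'{seq}.pkl', 'wb') as f:
            pickle.dump(cubes, f)
        del cubes  # drop the reference so memory is freed before the next sequence

def merge_chunks(out_path='semantickitti_train_cube_size_12.pkl'):
    """Concatenate the per-sequence pickles into the final training file."""
    merged = []
    for chunk in sorted(CHUNK_DIR.glob('*.pkl')):
        with open(chunk, 'rb') as f:
            merged.extend(pickle.load(f))
    with open(out_path, 'wb') as f:
        pickle.dump(merged, f)
```

Note that the merge step still has to hold the full dataset in memory once, but the peak usage while processing and dumping is much lower, which is typically where the OOM kill happens.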

baibaibai-star commented 2 years ago

Ok! Thank you very much!

Junwei-LV commented 1 month ago

> My workaround takes some time. For SemanticKITTI, I resolved the issue by writing separate pickles and merging them later. For ShapeNet, I skipped the mesh files that raise such an error.

Could you kindly describe how to skip the mesh files that raise such an error? Thanks.
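
For what it's worth, 0x89 0x50 is the PNG magic number, so that error usually means a PNG texture stored with a .jpg extension. A minimal sketch of the skip-on-error idea, with `trimesh.load` as a stand-in for whatever loader prepare_shapenet.py actually uses:

```python
import trimesh  # stand-in loader; substitute the one prepare_shapenet.py uses

def load_valid_meshes(mesh_paths):
    """Load meshes, skipping any file whose loader raises
    (e.g. the 'Not a JPEG file: starts with 0x89 0x50' texture error)."""
    meshes, skipped = [], []
    for path in mesh_paths:
        try:
            meshes.append(trimesh.load(path, force='mesh'))
        except Exception as exc:
            skipped.append(path)
            print(f'Skipping {path}: {exc}')
    print(f'Loaded {len(meshes)} meshes, skipped {len(skipped)}')
    return meshes
```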