val-iisc / densepcr

Repository for 'Dense 3D Point Cloud Reconstruction Using a Deep Pyramid Network' [WACV 2019]

Could you share the trained model? #3

Open xiaomingjie opened 5 years ago

xiaomingjie commented 5 years ago

Hi @priyankamandikal, I tried to train your DensePCR on my own, but I found it difficult. In particular, while training the dense_2 stage (4k_to_16k), it was hard to get convergence and the visualized results were awful, whereas dense_1 converged pretty well. Strangely, even with awful visualized results, after fine-tuning the CD and EMD numbers came out about as good as your paper reports. So I'm wondering whether you could share your trained model so that I can get the results your paper demonstrates. Thank you very much.

BGHB commented 5 years ago

@xiaomingjie I'm a newcomer to this field. I downloaded this code to study it as well. Can you share your training dataset, or tell me how to get it? Please.

xiaomingjie commented 5 years ago

Sorry, I use my own dataset, built by rendering images and sampling point clouds from the ShapeNet dataset. I suggest you dig into ShapeNet and see how to process it.
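The thread never shows the actual sampling code, but "sampling point clouds" from ShapeNet meshes is usually area-weighted uniform sampling on the triangle surface. Here is a minimal NumPy sketch; the function name and the unit-cube test mesh are illustrative, not from this repo or from jackd/shapenet:

```python
import numpy as np

def sample_point_cloud(vertices, faces, n_points=1024, seed=0):
    """Area-weighted uniform sampling of points on a triangle mesh."""
    rng = np.random.default_rng(seed)
    tris = vertices[faces]                                   # (F, 3, 3)
    # triangle areas via the cross product of two edges
    cross = np.cross(tris[:, 1] - tris[:, 0], tris[:, 2] - tris[:, 0])
    areas = 0.5 * np.linalg.norm(cross, axis=1)
    # pick triangles proportionally to their area
    idx = rng.choice(len(faces), size=n_points, p=areas / areas.sum())
    # uniform random barycentric coordinates (fold points into the triangle)
    u, v = rng.random((2, n_points))
    flip = u + v > 1.0
    u[flip], v[flip] = 1.0 - u[flip], 1.0 - v[flip]
    w = 1.0 - u - v
    t = tris[idx]
    return u[:, None] * t[:, 0] + v[:, None] * t[:, 1] + w[:, None] * t[:, 2]

# illustrative unit-cube mesh (8 vertices, 12 triangles)
verts = np.array([[0,0,0],[1,0,0],[1,1,0],[0,1,0],
                  [0,0,1],[1,0,1],[1,1,1],[0,1,1]], dtype=float)
faces = np.array([[0,1,2],[0,2,3],[4,5,6],[4,6,7],
                  [0,1,5],[0,5,4],[2,3,7],[2,7,6],
                  [1,2,6],[1,6,5],[0,3,7],[0,7,4]])
pts = sample_point_cloud(verts, faces, n_points=2048)
```

In practice a mesh library (e.g. trimesh) does the same thing; the sketch above just makes the area-weighting step explicit.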

BGHB commented 5 years ago

@xiaomingjie thanks for your answer. I'm trying to run train_base.py, but I'm missing tf_nndistance_so.so. How do I get that file?

xiaomingjie commented 5 years ago

@BGHB You can follow PSGN and compile it from the .cpp source.

BGHB commented 5 years ago

@xiaomingjie Do you have the ShapeNet dataset? Can you share it with me?

luotuoqingshan commented 5 years ago

Hi @xiaomingjie, I used the jackd/shapenet tools to generate point cloud and RGB data from the original ShapeNet. I would like to use them with PSGN, because PSGN doesn't provide a dataset split by category. I finished processing the data and got the point clouds and RGB images, but I found the scale of my point clouds differs from the dataset provided by PSGN. I don't have enough disk space to retrain PSGN on my own data; I just want to do some testing. I'm quite frustrated by the complex data preparation. Could you let me know how to deal with this scale problem?

xiaomingjie commented 5 years ago

Hi @luotuoqingshan, as far as I know, the training data of PSGN is much larger than the ShapeNet dataset itself. Jack renders and samples each ShapeNet model with his own method. Each RGB image should be about 10 KB and each point cloud about 200 KB. There are no more than 50k models, so it should be about 10 GB in total. Is there something wrong with how you generate your data?

luotuoqingshan commented 5 years ago

Hi @xiaomingjie, I really appreciate your quick reply. Yes, PSGN's training data is very large. I generated all the point cloud data (16384 points per model) with Jack's method and it takes up 17 GB; with the mesh data (for sampling point clouds) and the original ShapeNet data I have used 70 GB. Rendering seems slow, so I only rendered part of the models. It really takes a lot of space given my compute resources. I saw you opened an issue in PSGN: did you retrain PSGN with your own data, and did the model perform well? My goal is to run some tests with PSGN and report results per category. My naive idea is to compute the average scale of PSGN's training point clouds and of my own data, and apply a simple rescale. Do you think this would work?

xiaomingjie commented 5 years ago

@luotuoqingshan I retrained PSGN with my own data, but due to differences in data distribution and normalization, I didn't get the same CD or EMD numbers as the paper reports. Actually, many point cloud reconstruction papers report different results because of such differences in data distribution and normalization. So if you want to test PSGN, I suggest retraining it on your own data, reporting your own results, and not comparing directly against the original numbers.
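One common way to remove the scale differences discussed above is to center each cloud and rescale it to the unit sphere before computing metrics. A minimal sketch (the exact normalization used by PSGN or DensePCR may differ, so treat this as one possible convention):

```python
import numpy as np

def normalize_to_unit_sphere(pc):
    """Center a point cloud at the origin and scale it into the unit sphere."""
    pc = pc - pc.mean(axis=0)                    # translate centroid to origin
    scale = np.linalg.norm(pc, axis=1).max()     # radius of farthest point
    return pc / scale

# example: an arbitrary cloud far from the origin, at a large scale
cloud = np.random.default_rng(0).random((100, 3)) * 5.0 + 2.0
normed = normalize_to_unit_sphere(cloud)
```

Applying the same normalization to both the prediction and the ground truth at least makes CD/EMD numbers comparable within one's own experiments, even if they still can't be compared across papers.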

luotuoqingshan commented 5 years ago

@xiaomingjie, thanks a lot for your suggestions. Sometimes we need to adjust the coefficients of different losses, such as the weights of dis_forward and dis_backward in CD. Do you know how to choose these coefficients, e.g. by binary search?

xiaomingjie commented 5 years ago

@luotuoqingshan I treat those as hyper-parameters. As far as I know, only PSGN weights the forward and backward terms differently; most works define the loss directly as the standard CD formulation. I don't think it matters much.
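For reference, the weighted Chamfer distance being discussed can be sketched in brute-force NumPy. The `w_forward`/`w_backward` parameters play the role of the dis_forward/dis_backward coefficients above; this is an illustrative O(N·M) sketch, not the repo's compiled tf_nndistance op:

```python
import numpy as np

def chamfer_distance(pred, gt, w_forward=1.0, w_backward=1.0):
    """Weighted squared Chamfer distance between two point clouds.

    pred: (N, 3) predicted points; gt: (M, 3) ground-truth points.
    """
    # pairwise squared distances, shape (N, M)
    d = np.sum((pred[:, None, :] - gt[None, :, :]) ** 2, axis=-1)
    forward = d.min(axis=1).mean()    # pred -> gt: nearest gt per prediction
    backward = d.min(axis=0).mean()   # gt -> pred: nearest prediction per gt
    return w_forward * forward + w_backward * backward

# tiny example: single points one unit apart
pred = np.array([[0.0, 0.0, 0.0]])
gt = np.array([[1.0, 0.0, 0.0]])
cd = chamfer_distance(pred, gt)           # 1.0 + 1.0 = 2.0
```

With `w_forward == w_backward` this reduces to the standard symmetric CD most papers use, which is consistent with the point that the weighting is rarely critical.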

luotuoqingshan commented 5 years ago

Hi @xiaomingjie, I am trying to reproduce PSGN using jackd's dataset utils, but I get awful visualization results despite pretty good CD numbers. I saw you had the same problem. Do you have any suggestions?

xiaomingjie commented 5 years ago

@luotuoqingshan I can only come up with two thoughts.

  1. I once forgot to normalize my input images to [0, 1] at prediction time.
  2. Maybe your network has not converged and you stopped training too early.

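Point 1 above is easy to guard against with an explicit preprocessing step. A minimal sketch, assuming 8-bit RGB inputs (the function name is illustrative, not from the repo):

```python
import numpy as np

def preprocess_image(img_uint8):
    """Scale an 8-bit image to [0, 1] float32 before feeding the network,
    matching the normalization used at training time."""
    return img_uint8.astype(np.float32) / 255.0

# example: a white 4x4 RGB image
img = np.full((4, 4, 3), 255, dtype=np.uint8)
x = preprocess_image(img)
```

Applying the same normalization at train and test time avoids the mismatch described in point 1.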
luotuoqingshan commented 5 years ago

@xiaomingjie, thanks a lot for your reply. How fast did training converge? Did you add batch normalization? Do you have an unofficial implementation?

xiaomingjie commented 5 years ago

@luotuoqingshan The speed depends on your training data if you did not modify PSGN. The official implementation works for me; if you made modifications to PSGN, then I can't tell why you got such predictions.

luotuoqingshan commented 5 years ago

@xiaomingjie, thank you. I'll give it a try.

yxyyz commented 4 years ago

Hi @xiaomingjie, can you share your pipeline for rendering images and sampling point clouds from the ShapeNet dataset?

D0ub1ePieR commented 4 years ago

@xiaomingjie Could you share your trained model for this work? I want to compute the Chamfer distance with my own code and compare it to my recent work.

RenInsist commented 2 years ago

@xiaomingjie @BGHB I would like to reproduce the DensePCR paper. Could you give me some guidance? Would it be okay to exchange contact details?

BGHB commented 2 years ago

@RenInsist I've already abandoned this.

poloinh commented 1 year ago

Hi @xiaomingjie, I'm a newcomer to this field. I downloaded this code and I have the dataset, but I found no test.py file. Can you share it?