XuyangBai / D3Feat

[TensorFlow] Official implementation of CVPR'20 oral paper - D3Feat: Joint Learning of Dense Detection and Description of 3D Local Features https://arxiv.org/abs/2003.03164
MIT License

A question about downsampling of ETH DataSet #36

Closed Hui-design closed 3 years ago

Hui-design commented 3 years ago

Hi, thanks for your amazing work. I have come across a problem and would like to ask for your help. When you preprocessed the ETH dataset, you used a downsample_size of 0.0625. Unfortunately, when we tested it, our GPU (22945 MiB) ran out of memory. After debugging, we found that the number of points remaining after downsampling at 0.0625 was too large, and that the point count varies a lot from fragment to fragment. How did you solve this?
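For reference, the effect of the downsample_size (voxel edge length) on the point count can be sketched with a plain NumPy voxel-grid downsampling, where each occupied voxel is replaced by the centroid of its points. This is only an illustrative sketch, not the repo's actual grid-subsampling code; the function name and the synthetic cloud are made up for the example:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Keep one point (the centroid) per occupied voxel of edge length voxel_size."""
    # Map each point to integer voxel coordinates.
    coords = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel: `inverse` maps each point to its voxel index.
    _, inverse, counts = np.unique(coords, axis=0,
                                   return_inverse=True, return_counts=True)
    # Accumulate point coordinates per voxel, then divide by the voxel's count.
    centroids = np.zeros((counts.shape[0], 3))
    np.add.at(centroids, inverse, points)
    return centroids / counts[:, None]

# Synthetic 1 m^3 cloud: with voxel_size = 0.0625 there are at most 16^3 = 4096
# occupied voxels, so the output size is bounded regardless of the input size.
rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 1.0, size=(100_000, 3))
down = voxel_downsample(cloud, 0.0625)
print(cloud.shape[0], "->", down.shape[0])
```

On real ETH fragments the spatial extent is much larger than 1 m, so the same voxel size can still leave a very large, and very uneven, number of points per fragment.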

XuyangBai commented 3 years ago

Hi, the experiments on the ETH dataset were conducted on my 2080Ti GPU, so it should not OOM on your GPU. Did you use the ply provided by 3DSmoothNet? Could you double-check the data and the downsampling step?

Hui-design commented 3 years ago

Thank you very much for your reply. I have re-checked the dataset and the downsampling process, and I don't think anything is wrong there. Actually, I did not run the TF version of the code; I only came here to look up the downsample_size. I am reading the PyTorch version, but it has no test code for ETH, so I modified "test.py" and "ThreeDMatch.py" (specifically the scene list and file paths) to test on ETH, and then the OOM appeared. Did I miss any other preprocessing steps?
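One common workaround for per-fragment OOM during testing (my own suggestion, not something the authors prescribe) is to cap the number of points per fragment by random subsampling before feeding it to the network. The function name `cap_points` and the `max_points` value below are hypothetical:

```python
import numpy as np

def cap_points(points, max_points=40_000, seed=0):
    """Randomly subsample a fragment that exceeds max_points.

    Assumption: a uniform random subset keeps enough geometry for feature
    extraction while bounding memory use; smaller fragments pass through.
    """
    if points.shape[0] <= max_points:
        return points
    rng = np.random.default_rng(seed)
    idx = rng.choice(points.shape[0], size=max_points, replace=False)
    return points[idx]

# Example: a large synthetic fragment gets capped, a small one is untouched.
big = np.random.default_rng(1).normal(size=(60_000, 3))
small = np.zeros((100, 3))
print(cap_points(big).shape, cap_points(small).shape)
```

Capping evens out the per-fragment point counts, which also makes GPU memory usage more predictable across the test set.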

XuyangBai commented 3 years ago

Could you leave an email address? I can send you the pytorch eth testing code.

Hui-design commented 3 years ago

2225705604@qq.cm Thank you very much!

Hui-design commented 3 years ago

The email should be 2225705604@qq.com. Sorry, I sent the wrong email address. I'm sorry for bothering you so late; feel free to send it tomorrow.