-
In China, we cannot access files on Google Drive. Can you share other download links for ModelNet40 and ScanObjectNN? For example, [Baidu NetDisk](https://pan.baidu.com/) has similar functio…
-
@MohamedAfham Recently, I have run all the experiments in the codebase at least 3 times to make sure there were no explicit exceptions during my runs.
Some of the results are very encouraging, whi…
-
Hello, have you tried your model on the ScanObjectNN dataset for 3D object classification? If so, could you please provide the results of your experiment? Thank you for your time!
-
Hello,
Thank you for your great work.
I have a question: could you please tell me where I can find the "test_meshes" folder? It is needed to create the "occlusion" and "lidar" corruptions in your …
-
Thanks so much for sharing your excellent work.
In train.py, line 41, the 'full' partition is used. However, I cannot find the corresponding '.h5' file in the downloaded ScanObjectNN dataset.
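For reference, this is roughly how I am checking which partition files my download actually contains (just a quick sketch; the directory path and file pattern are my own guesses, not from your repo):

```python
import glob
import os

import h5py

# Hypothetical location of the downloaded ScanObjectNN data; adjust as needed.
DATA_DIR = "data/ScanObjectNN/main_split"

# Print every .h5 file in the download and the datasets it contains,
# to see whether any file corresponds to the 'full' partition.
for path in sorted(glob.glob(os.path.join(DATA_DIR, "*.h5"))):
    with h5py.File(path, "r") as f:
        shapes = {key: f[key].shape for key in f.keys()}
    print(os.path.basename(path), shapes)
```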
-
Thanks for sharing the paper and dataset. Excellent work!
May I ask whether there are official train, val, and test splits of the OmniObject3D dataset? I want to use it to train and evaluate 3D point cloud…
-
Hi,
I would like to know whether, when you train and test on ScanObjectNN, you use all 15 classes or only the 11 classes shared with ModelNet40.
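To make the question concrete, this is the kind of filtering I have in mind if only a subset of classes is used (a rough sketch; the SHARED_CLASS_IDS list is a placeholder, not your actual mapping):

```python
import numpy as np

# Placeholder indices of the ScanObjectNN classes shared with ModelNet40;
# the real mapping would have to come from the authors.
SHARED_CLASS_IDS = np.array([0, 1, 2])

def filter_to_shared_classes(points, labels):
    """Keep only samples whose label falls in the shared-class subset."""
    mask = np.isin(labels, SHARED_CLASS_IDS)
    return points[mask], labels[mask]
```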
thanks
-
Sorry to bother you,
I wonder if you plan to release the pretrained S2F or F2S models,
because the link to the pretrained model you released is incorrect or has some problem. When I opened it, it lin…
-
Is the load_data function linked below still working?
https://github.com/hkust-vgd/scanobjectnn/blob/fe60aeade9ceb8882bc3f1bc40612e65469d7e77/data_utils.py#L77
I could not find any `.pkl` file in t…
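For context, this is the kind of loader I fall back to for the .h5 release instead (only a sketch under my own assumptions about the file names; it is not the repo's load_data):

```python
import os

import h5py
import numpy as np

def load_h5_split(data_dir, partition="training"):
    """Sketch: load one ScanObjectNN split from an .h5 file.

    Assumes a file named '<partition>_objectdataset.h5' containing
    'data' and 'label' datasets; adjust the name to whatever your
    download actually contains.
    """
    path = os.path.join(data_dir, f"{partition}_objectdataset.h5")
    with h5py.File(path, "r") as f:
        points = f["data"][:].astype(np.float32)  # (N, num_points, 3) point clouds
        labels = f["label"][:].astype(np.int64)   # (N,) class indices
    return points, labels
```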
-
Thanks! Your answer is clear. By the way, could you provide the hyperparameters for fine-tuning PointMLP on ModelNet40 and ScanObjectNN? I tried your checkpoint to reproduce the fine-tuning results, …