QianyiWu / objsdf

:t-rex: [ECCV'22] PyTorch implementation of 'Object-Compositional Neural Implicit Surfaces'

About preprocessing the ScanNet dataset #4

Open Zhouky1 opened 1 year ago

Zhouky1 commented 1 year ago

Hi! Congratulations on your wonderful work! I have run into a problem with the experiments on the ScanNet dataset. I see the code needs "label_mapping_instance.txt", and I generated it from "scenexxxx_00.aggregation.json" as "0, 1, 2, ..., 9", but the code still fails with "nll_loss_forward_reduce_cuda_kernel_2d: block: [0,0,0], thread: [28,0,0] Assertion t >= 0 && t < n_classes failed." It seems something is wrong with the semantic ground truth. Could you tell me what might be causing this? Thank you in advance!
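(That CUDA assertion fires when a target label falls outside `[0, n_classes)`. A minimal sketch for checking the label map before training, assuming the semantic labels are loaded as a CPU `LongTensor`; the function name is illustrative, not from the repo:)

```python
import torch

def check_labels(labels: torch.Tensor, n_classes: int) -> list:
    """Return the label values that would trip the nll_loss
    assertion (t >= 0 && t < n_classes)."""
    bad = labels[(labels < 0) | (labels >= n_classes)]
    return sorted(torch.unique(bad).tolist())

# Example: a label map containing a stray value 9 when the model
# only predicts 9 classes (valid targets are 0..8).
labels = torch.tensor([0, 3, 8, 9])
print(check_labels(labels, n_classes=9))  # [9]
```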

QianyiWu commented 1 year ago

Hi Zhouky1,

Thanks for your interest in our paper. I build the label mapping file by enumerating all instances in the current scene. You can refer to here for some code snippets you might need.

But the preprocessing part of ScanNet is still not very clean, and I am currently working on an improved version of ObjectSDF. I will try to release a cleaner version of the code for handling the ScanNet dataset in that release.

Zhouky1 commented 1 year ago

Thanks for your reply! This really helps, since the result differs from what I got from "aggregation.json". But another problem has come up. For example, in scene0192_00 I get the labels [0, 1, ..., 8], so I set "model.d_out" in the config to 9, but all three losses become NaN after roughly the 60th to 80th sample, even in the first epoch. The problem persists if I set "ignore_index = 8" when computing the cross-entropy loss. (I tried this because aggregation.json only contains the labels [0, 1, ..., 7].) Are there any other parameters I need to change? Or could you kindly send me the config and checkpoints of the ScanNet experiments? I have already obtained authorization from the ScanNet authors. Thank you in advance.
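(One common cause of a mismatch between the label count and `model.d_out` is non-contiguous raw labels. A sketch of remapping arbitrary label values to `0..K-1` so that `K` can be used directly as the output dimension, assuming labels are a CPU tensor; this is not the repo's own preprocessing:)

```python
import torch

def remap_to_contiguous(labels: torch.Tensor):
    """Remap arbitrary integer labels to a contiguous 0..K-1 range.
    Returns the remapped tensor and K (the number of classes)."""
    uniq = torch.unique(labels)                  # sorted unique values
    remapped = torch.searchsorted(uniq, labels)  # index of each label in uniq
    return remapped, len(uniq)

# Example: raw labels {0, 1, 7, 8} become {0, 1, 2, 3} with K = 4.
raw = torch.tensor([0, 1, 7, 8, 7])
remapped, k = remap_to_contiguous(raw)
print(remapped.tolist(), k)  # [0, 1, 2, 3, 2] 4
```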

jieBaoa commented 1 year ago

Hi Zhouky1, have you solved this problem?