nickgkan / butd_detr

Code for the ECCV22 paper "Bottom Up Top Down Detection Transformers for Language Grounding in Images and Point Clouds"

detailed information of cls_results.json #37

Closed tony10101105 closed 11 months ago

tony10101105 commented 1 year ago

Hi,

In a previous issue you said that a pointnet++ classifier's classification results (as in prior works) are loaded when using the argument but_cls. I wonder how exactly you got cls_results.json? Also, another work, NS3D, mentions that there are 607 classes of object boxes in SR3D. Does cls_results.json contain classification results for all 607 of these classes?

Thanks again for this great work!

nickgkan commented 1 year ago

We trained a pointnet++ for object classification on ScanNet. We used nyu ids, which merge some classes; as a result, we have 485 classes. Then we ran inference on all ground-truth boxes and kept the most confident prediction of our classifier.
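In case a concrete picture helps, here is a minimal sketch of that crop-classify-argmax loop in plain Python. Everything here is illustrative, not the authors' code: `classify` is a stub standing in for the trained pointnet++, `points_in_box` assumes axis-aligned (center, size) boxes, and the class list is a tiny made-up subset of the 485 nyu-id classes.

```python
import json

# Illustrative subset of the 485 nyu-id classes (assumption, not the real list).
NYU_CLASSES = ["chair", "table", "sofa"]

def classify(points):
    # Stub standing in for pointnet++ inference: returns fake per-class scores.
    n = len(points)
    return [(n * (i + 1)) % 5 for i in range(len(NYU_CLASSES))]

def points_in_box(points, box):
    # Keep the points that fall inside an axis-aligned (center, size) box.
    cx, cy, cz, dx, dy, dz = box
    return [p for p in points
            if abs(p[0] - cx) <= dx / 2
            and abs(p[1] - cy) <= dy / 2
            and abs(p[2] - cz) <= dz / 2]

def build_cls_results(scenes):
    # scenes: {scan_id: (point_list, ground_truth_boxes)}
    # For each ground-truth box, keep only the most confident class name.
    results = {}
    for scan_id, (points, boxes) in scenes.items():
        names = []
        for box in boxes:
            scores = classify(points_in_box(points, box))
            best = max(range(len(scores)), key=scores.__getitem__)
            names.append(NYU_CLASSES[best])
        results[scan_id] = names
    return results

# The resulting dict can then be dumped to disk with json.dump(results, f).
```

The key point this illustrates is that only the argmax class name per box is stored, so the downstream pipeline never sees the classifier's scores or class count.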

While it's true that ScanNet has more classes (many of which don't even appear in the training set), the exact number of classes we use is irrelevant to the rest of the pipeline. Our model simply sees a box and a predicted class name.

Lastly, prior works train a classifier as an auxiliary loss. We do not do that; we train a classifier separately and simply store its predictions.

tony10101105 commented 12 months ago

Hi,

Thanks for the reply! Could you provide this pointnet++'s weights and its hyper-parameters (e.g., how to load it), if possible? I would like to use this object-level pointnet++ as a feature extractor in my own project. I noticed that the pretrained Group-Free pointnet++ you provided performs feature upsampling (e.g., B × 1024 × 288 per-point features) rather than downsampling to a single object-level feature (e.g., B × 1 × 288).
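For what it's worth, the usual way to collapse per-point backbone features into one object-level vector is to max-pool over the point dimension, which is what PointNet-style classifiers do before their final MLP head. A hedged PyTorch sketch with a tiny stand-in network (an assumption for illustration, not the actual Group-Free backbone or the authors' pointnet++ weights):

```python
import torch
import torch.nn as nn

class TinyObjectEncoder(nn.Module):
    # Stand-in for a point backbone (illustrative, not the paper's model):
    # a shared per-point MLP followed by max-pooling over points, so a
    # (B, N, 3) point cloud becomes a single (B, C) object-level feature.
    def __init__(self, feat_dim=288):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(3, 64), nn.ReLU(),
            nn.Linear(64, feat_dim), nn.ReLU(),
        )

    def forward(self, xyz):                 # xyz: (B, N, 3)
        per_point = self.mlp(xyz)           # (B, N, 288) per-point features
        return per_point.max(dim=1).values  # (B, 288) pooled object feature
```

A classification head would then just be one linear layer mapping the pooled feature to the 485 class logits; a feature extractor would stop at the pooled tensor.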

ayushjain1144 commented 11 months ago

Hi, sorry for the late reply. We could not find the checkpoint or the code, but we used this codebase, modified slightly to train for ScanNet classification: https://github.com/erikwijmans/Pointnet2_PyTorch/tree/master