VinAIResearch / ISBNet

ISBNet: a 3D Point Cloud Instance Segmentation Network with Instance-aware Sampling and Box-aware Dynamic Convolution (CVPR 2023)
Apache License 2.0

Some problems when training on my own dataset #21

Closed buggogogo closed 1 year ago

buggogogo commented 1 year ago

Hello author! Thank you very much for your work; it has benefited me a lot. I converted my own dataset by following the STPLS3D example and modified the model-related files for training. But whichever value the semantic labels start from (0 or 1), the first class cannot be evaluated during training and shows nan. The problem is shown in the picture. Is there any way to solve this problem?

[screenshot: 微信图片_20230612194935]

ngoductuanlhp commented 1 year ago

nan results mean that there is no point with the GT label 'stem'. Maybe you set the instance_label of 'stem' points to -100 during data preprocessing. If both stem and leaf are instance classes, you should set semantic_classes=2 and instance_classes=2 in the config file.
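For concreteness, here is a minimal preprocessing sketch for such a two-class (stem/leaf) dataset. The raw id scheme (0 = unannotated, 1 = stem, 2 = leaf) and the function name are assumptions for illustration; only the ignore value -100 and the two-class config follow the advice above:

```python
import numpy as np

IGNORE_LABEL = -100  # points excluded from loss and evaluation

def relabel(sem_raw: np.ndarray, inst_raw: np.ndarray):
    """Map assumed raw annotations (0=unannotated, 1=stem, 2=leaf)
    to labels consistent with semantic_classes=2, instance_classes=2."""
    sem = np.full_like(sem_raw, IGNORE_LABEL)
    sem[sem_raw == 1] = 0  # stem
    sem[sem_raw == 2] = 1  # leaf

    inst = inst_raw.copy()
    # Only unannotated points get -100. Keep instance ids for BOTH
    # stem and leaf; setting stem's instance_label to -100 is exactly
    # what produces nan for that class during evaluation.
    inst[sem_raw == 0] = IGNORE_LABEL
    return sem, inst
```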

huanghuang113 commented 1 year ago

> (quoting @buggogogo's original question above)

Hello, could you please share your code for preprocessing your own dataset? I need to preprocess my dataset as well but am running into some difficulties.

huanghuang113 commented 12 months ago

> (quoting @buggogogo's original question above)

Hi, could you please show me how your config file is set up? I am currently learning about this and would appreciate seeing your configuration.

wowluna commented 8 months ago

> (quoting @buggogogo's original question above)

Have you perhaps solved this problem? I ran into the same issue. [screenshot]

wowluna commented 8 months ago

Hi! I modified the following parameters in instance_eval.py and isbnet.py, and got an evaluation to run. [screenshots] But I don't understand the parameter gts_sem. Can you explain it for me? Thanks!! @ngoductuanlhp @buggogogo

ngoductuanlhp commented 8 months ago

Dear @wowluna, nan in evaluation means there is no GT instance of that class in the evaluation samples. Let me clarify these lines of code. In ScanNetV2 there are 20 categories for semantic segmentation and 18 categories for instance segmentation (we assume that the number of semantic classes is always greater than or equal to the number of instance classes), so the difference is 2. The transformation here is `gts_sem = gts_sem + (num_semantic - num_instance) + 1`. Its purpose is to align the categories in the ground-truth annotations with the predicted categories in our predictions.
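A toy restatement of that offset (the helper name is hypothetical; the id arithmetic is exactly the line quoted above):

```python
def align_gt_semantic(gts_sem: int, num_semantic: int, num_instance: int) -> int:
    """Shift a GT semantic id into the category id space of the predictions."""
    assert num_semantic >= num_instance  # assumption stated above
    return gts_sem + (num_semantic - num_instance) + 1

# ScanNetV2: 20 semantic classes, 18 instance classes -> offset of 3
print(align_gt_semantic(0, 20, 18))   # 3
print(align_gt_semantic(17, 20, 18))  # 20
```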

wowluna commented 8 months ago

> (quoting @ngoductuanlhp's explanation above)

Thanks! I get it now.