Open CBY-9527 opened 1 year ago
> I found that the number of pseudo-labels generated by pvrcnn_dts gradually decreases to 0 as the number of iteration updates increases

It seems like the model collapses into a trivial state. How about reducing the inter_graph_loss_weight?
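For context, a minimal sketch of how a weighted auxiliary loss term is usually combined with the main detection loss, and where a smaller inter_graph_loss_weight would enter. The function and argument names below are illustrative, not the actual DTS code:

```python
# Hypothetical sketch: combining a main loss with a weighted auxiliary term.
# det_loss / inter_graph_loss are illustrative names, not the real DTS API.

def total_loss(det_loss: float, inter_graph_loss: float,
               inter_graph_loss_weight: float = 1.0) -> float:
    """Combine the detection loss with the weighted inter-graph term."""
    return det_loss + inter_graph_loss_weight * inter_graph_loss

# Down-weighting the auxiliary term (e.g. 1.0 -> 0.1) reduces its pull on
# the shared features, which may help if it drives the model toward a
# trivial (collapsed) solution.
low = total_loss(1.5, 0.8, inter_graph_loss_weight=0.1)
high = total_loss(1.5, 0.8, inter_graph_loss_weight=1.0)
print(high, low)
```

The design intent is simply that the weight controls how strongly the auxiliary objective competes with the detection objective during training.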
> I would like to ask what may be the reason for the low performance. Is there a problem with the code or configuration file?

There may be some problems with the configuration file; I'll check it in my spare time.
Thank you for your reply. I tried reducing the inter_graph_loss_weight to 1/10 of its original value before, but the self-training performance was even worse.
Excuse me, I have encountered some problems. The performance of the pvrcnn_dts I reproduced (nuScenes -> KITTI) is only 82.61/66.86 (Mod. BEV/3D AP), which is some way off the 83.9/71.8 reported in the paper. The model trained on the source domain should be fine: testing it directly on KITTI reaches 77.26/63.43. What may be the reason for the low performance? Is there a problem with the code or configuration file? In addition, I found that the number of pseudo-labels generated by pvrcnn_dts gradually decreases to 0 as the number of iteration updates increases. This seems abnormal. Have you encountered it in previous experiments?
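The collapse described above (pseudo-label count shrinking to 0 across self-training rounds) can often be caught early with a simple per-round sanity check. Below is a minimal sketch, assuming pseudo-labels are kept by a confidence threshold; all names, the threshold value, and the simulated scores are illustrative, not the actual DTS pipeline:

```python
# Illustrative sketch: track how many pseudo-labels survive the score
# threshold each self-training round, and flag a collapse early.
# Scores here are made up; in practice they would come from the teacher
# model's predictions on the target-domain (KITTI) scans.

SCORE_THRESH = 0.6  # hypothetical pseudo-label confidence threshold

def count_pseudo_labels(scores, thresh=SCORE_THRESH):
    """Number of predicted boxes kept as pseudo-labels this round."""
    return sum(1 for s in scores if s >= thresh)

def check_collapse(counts, min_ratio=0.5):
    """Flag rounds whose label count drops below min_ratio of round 0's."""
    return [c < min_ratio * counts[0] for c in counts]

# Simulated per-round score lists, with confidence shrinking round by
# round as reported in this issue.
rounds = [
    [0.9, 0.8, 0.75, 0.7, 0.65],   # round 0
    [0.8, 0.7, 0.65, 0.5, 0.4],    # round 1
    [0.55, 0.5, 0.4, 0.3, 0.2],    # round 2: everything below threshold
]
counts = [count_pseudo_labels(r) for r in rounds]
print(counts)                  # [5, 3, 0]
print(check_collapse(counts))  # [False, False, True]
```

A falling count like this usually means either the teacher's confidences are degrading (genuine collapse, e.g. from an over-weighted auxiliary loss) or the threshold is too aggressive for the target domain; logging the count per round distinguishes a gradual drift from a sudden drop.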