Closed jjlinghu closed 6 months ago
Thanks to the authors for releasing the code so quickly! I retested the code and fully reproduced the performance the authors report in the paper.
There are a few things to note:
- When I load the pre-trained model `checkpoint_best.pth` (https://github.com/ZhanYang-nwpu/Mono3DVG/blob/da207e5bf216b707370a5175133e11e6614c4d30/lib/helpers/save_helper.py#L80), I hit `RuntimeError: Unexpected key(s) in state_dict: "text_encoder.embeddings.position_ids"`. After carefully comparing the model and the weight parameters, I couldn't find any difference, which confused me. Anyway, I worked around this issue by setting `strict=False`, and it does not degrade the performance.
- Some `np.float` and `np.bool` need to be changed to `np.float64` and `np.bool_` if the NumPy version is > 1.20.
- It seems that not only the Mono3DRefer dataset is used; we may also need the raw `KITTI/training/calib` directory, exposed via a soft link. https://github.com/ZhanYang-nwpu/Mono3DVG/blob/da207e5bf216b707370a5175133e11e6614c4d30/lib/datasets/mono3drefer/mono3drefer_dataset.py#L68
- The file path may need to be modified appropriately.
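The `strict=False` workaround above can be sketched as follows (the model here is a hypothetical stand-in for the real network, used only to show how PyTorch handles the unexpected key):

```python
import torch
import torch.nn as nn

# Hypothetical minimal model; the real checkpoint is loaded the same way.
model = nn.Linear(4, 2)
state_dict = dict(model.state_dict())
# Simulate the stray key reported in the RuntimeError above.
state_dict["text_encoder.embeddings.position_ids"] = torch.arange(8)

# strict=True (the default) raises RuntimeError on unexpected keys;
# strict=False skips them and returns them for inspection instead.
result = model.load_state_dict(state_dict, strict=False)
print(result.unexpected_keys)
```

Checking `result.unexpected_keys` afterwards confirms that only the stale positional-embedding buffer was skipped, so the actual weights are still loaded and performance is unaffected.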
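For the NumPy renames above, the fix is a direct substitution (the deprecated aliases were removed entirely in NumPy 1.24+):

```python
import numpy as np

# Old (NumPy < 1.20): dtype=np.float, dtype=np.bool  -- removed in 1.24+
# New:                dtype=np.float64, dtype=np.bool_
x = np.zeros(3, dtype=np.float64)
mask = np.array([True, False, True], dtype=np.bool_)
print(x.dtype, mask.dtype)
```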
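The calibration soft link can be created like this; the paths below are placeholders and must be adapted to where your raw KITTI data and the repo's expected data directory actually live:

```shell
# Placeholder paths: expose the raw KITTI calib files where
# mono3drefer_dataset.py expects them, via a soft link.
mkdir -p data
ln -s /path/to/KITTI/training/calib data/calib
# Verify the link exists
test -L data/calib && echo "link ok"
```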
Thanks to the authors for their contributions to the community!
I have a lot of things going on recently, so I organized the code in a hurry. Thank you very much for the reminder; I will check the code again.
It seems that the pre-trained model (MonoDETR) was corrupted during upload.
Yes, it could be because of the network. I'll fix this right now.
This issue has been resolved.