Banconxuan / RTM3D

The official PyTorch Implementation of RTM3D and KM3D for Monocular 3D Object Detection

Drop parameter #32

Closed: nightfuryyy closed this issue 3 years ago

nightfuryyy commented 3 years ago

Thank you for the awesome repo. I tried training a KM3D model with the resdcn_101 backbone using this command:

python ./src/main.py --data_dir ./kitti_format --exp_id KM3D_resdcn_101 --batch_size 10 --gpus 0 --num_epochs 200 --num_workers 10 --save_all --arch resdcn_101 --head_conv 64 --lr 0.0001  

When testing on the validation dataset, I got these warnings:

Drop parameter wh.0.weight. If you see this, your model does not fully load the pre-trained weight. Please make sure you have correctly specified --arch xxx or set the correct --num_classes for your own dataset.
(The same warning is printed for wh.0.bias, wh.2.weight, wh.2.bias, reg.0.weight, reg.0.bias, reg.2.weight, reg.2.bias, hm_hp.0.weight, hm_hp.0.bias, hm_hp.2.weight, hm_hp.2.bias, hp_offset.0.weight, hp_offset.0.bias, hp_offset.2.weight, and hp_offset.2.bias.)
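For context, this kind of warning usually comes from a tolerant checkpoint loader that compares the checkpoint's keys against the current model's state_dict and skips any parameter whose name or shape does not match. Below is a minimal sketch of such a loader in PyTorch; `load_pretrained` is a hypothetical helper for illustration and is not necessarily the exact `load_model` code in this repo:

```python
import torch

def load_pretrained(model, checkpoint_path):
    """Hypothetical tolerant loader: checkpoint keys that do not exist in the
    model (or whose shapes differ) are dropped with a warning instead of
    raising an error."""
    checkpoint = torch.load(checkpoint_path, map_location='cpu')
    # Many CenterNet-style repos store the weights under a 'state_dict' key.
    state_dict = checkpoint.get('state_dict', checkpoint)

    model_state = model.state_dict()
    filtered = {}
    for k, v in state_dict.items():
        if k in model_state and model_state[k].shape == v.shape:
            filtered[k] = v
        else:
            # This is the situation behind the "Drop parameter ..." messages:
            # the checkpoint contains heads (e.g. wh, reg, hm_hp, hp_offset)
            # that the current architecture does not define, so they are skipped.
            print(f'Drop parameter {k}.')

    # strict=False leaves model parameters with no counterpart in the
    # checkpoint at their randomly initialized values.
    model.load_state_dict(filtered, strict=False)
    return model
```

If the dropped keys are only unused heads rather than backbone weights, the rest of the network still loads normally, which would explain why training still works.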

By the way, the results are quite good. Is this a problem? Thank you so much.

GUUUUG commented 3 years ago

Have you found a solution? I ran into the same problem.

gujiaqivadin commented 3 years ago

Have you solved the problem?