Shengcao-Cao / CMT

[CVPR 2023] CMT: Contrastive Mean Teacher for Domain Adaptive Object Detectors
Apache License 2.0

Removing Normalization during preprocessing leads to model failure (gives mAP 0) #4

Closed: Manjuphoenix closed this issue 1 year ago

Manjuphoenix commented 1 year ago

I'm training the code on a different dataset, and since its distribution differs from the original setting, I wanted to try a different normalization. As part of this experiment, removing the normalization from rcnn.py (line: https://github.com/Shengcao-Cao/CMT/blob/2965f3c977413e5b942aa4590838d781135d1ed7/CMT_AT/adapteacher/modeling/meta_arch/rcnn.py#L133C1-L133C74) gives 0 mAP, as seen in the image below.
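For context, the line in question appears to follow detectron2's standard `GeneralizedRCNN` preprocessing, which shifts and scales each channel by statistics registered on the model. A rough sketch (assumed pattern, not copied verbatim from the repo):

```python
# Sketch of detectron2-style input preprocessing (assumption; exact repo code may differ).
# Images arrive as float tensors with values in [0, 255]; pixel_mean / pixel_std
# are per-channel buffers registered on the model.
images = [x["image"].to(self.device) for x in batched_inputs]
images = [(x - self.pixel_mean) / self.pixel_std for x in images]  # the line removed in this experiment
```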

Is there any particular reason for the model collapse when normalization is removed?

The same code, with all other settings identical but normalization kept, gives decent results, some non-zero value, unlike this setting where normalization is removed.

[Screenshot from 2023-09-08 03-26-10: evaluation log showing 0 mAP]

Shengcao-Cao commented 1 year ago

Hello Manjuphoenix,

Since we are starting from a pre-trained model (VGG or ResNet) and you plan to fine-tune it on your own dataset, I would suggest using the same normalization that was used when pre-training the model. If you don't use the same normalization, the pre-trained weights might not be as effective. As a result, your final performance may be close to zero, as you have shown.
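A minimal sketch of what I mean, using detectron2's config keys. The values shown are detectron2's defaults for Caffe-style ImageNet backbones; substitute the statistics your own pre-trained weights expect:

```python
from detectron2.config import get_cfg

cfg = get_cfg()
# Match the normalization used when the backbone was pre-trained.
# These are detectron2's defaults for Caffe-style ImageNet models
# (BGR channel order, mean subtraction only); change them to whatever
# your checkpoint expects.
cfg.MODEL.PIXEL_MEAN = [103.530, 116.280, 123.675]
cfg.MODEL.PIXEL_STD = [1.0, 1.0, 1.0]
```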

Manjuphoenix commented 1 year ago

That was true, but the main reason I was getting 0 mAP even after 10K iterations was that, with the normalization code commented out, the image pixel values were in the range [0, 255], and the model performed well only when the pixel values were normalized, i.e., in the range [0, 1]. To bring the [0, 255] values into [0, 1], I divided the image by 255. With that change, the model gave a low but non-zero number, unlike the previous experiment where it was exactly 0. The remaining drop in performance is due to the reason you've mentioned.
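To illustrate the three input scalings being compared (a standalone sketch, not the repo's code; the mean/std values are the detectron2-style defaults from the comment above, used here only as an example):

```python
import torch

x = torch.randint(0, 256, (3, 480, 640)).float()  # raw image, values in [0, 255]

# 1) Normalization removed: inputs stay in [0, 255] -> model collapses, mAP 0.
no_norm = x

# 2) Divide by 255: inputs in [0, 1] -> trains, but gives low mAP because the
#    statistics still don't match the backbone's pre-training normalization.
scaled = x / 255.0

# 3) Match pre-training: per-channel mean/std, as in the original rcnn.py line.
pixel_mean = torch.tensor([103.530, 116.280, 123.675]).view(3, 1, 1)
pixel_std = torch.tensor([1.0, 1.0, 1.0]).view(3, 1, 1)
normed = (x - pixel_mean) / pixel_std
```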