Zzh-tju / DIoU-SSD-pytorch

Distance-IoU Loss: Faster and Better Learning for Bounding Box Regression (AAAI 2020)
GNU General Public License v3.0

don't work well with FCOS #11

Closed Libaishun closed 4 years ago

Libaishun commented 4 years ago

I replaced the GIoU loss in FCOS with DIoU loss and found no improvement; the DIoU loss also converges more slowly than the GIoU loss.

Zzh-tju commented 4 years ago

How about CIoU loss? @Libaishun

Libaishun commented 4 years ago

I didn't try it, since FCOS works with real w and h rather than the normalized ones used in computing the CIoU loss. How should I change your code to suit this case?

Zzh-tju commented 4 years ago

Aha, this is a specific requirement. You may have to implement it yourself. I'm also looking forward to seeing its results on FCOS. I suspect that since the convergence of DIoU is slow and the variables are all real sizes, this will really hinder convergence efficiency, because the denominator is very large. These loss functions are bounded: the larger the object, the smaller the gradient. That is why variables in (0, 1) are often used; they are better for optimization.
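For reference, the CIoU loss under discussion combines an IoU term, a normalized center-distance term (DIoU), and an aspect-ratio penalty. Below is a minimal plain-Python sketch of that formulation; the repository's actual implementation operates on PyTorch tensors, and the variable names here are illustrative.

```python
import math

def ciou_loss(box_a, box_b):
    """CIoU loss between two (x1, y1, x2, y2) boxes: a plain-Python
    sketch of IoU + normalized center distance + aspect-ratio penalty."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection and union for the IoU term
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)

    # Squared center distance, normalized by the squared diagonal of
    # the smallest enclosing box (the DIoU penalty term)
    rho2 = ((ax1 + ax2 - bx1 - bx2) / 2) ** 2 + ((ay1 + ay2 - by1 - by2) / 2) ** 2
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    c2 = cw ** 2 + ch ** 2

    # Aspect-ratio consistency term (the extra "C" in CIoU)
    wa, ha = ax2 - ax1, ay2 - ay1
    wb, hb = bx2 - bx1, by2 - by1
    v = (4 / math.pi ** 2) * (math.atan(wb / hb) - math.atan(wa / ha)) ** 2
    alpha = v / (1 - iou + v + 1e-9)

    return 1 - iou + rho2 / c2 + alpha * v
```

With real (pixel-sized) boxes, `rho2 / c2` stays bounded in [0, 1), which is the boundedness Zzh-tju refers to above.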

Libaishun commented 4 years ago

I didn't normalize the bbox coordinates in the first place; instead I just changed one line of code:

`ar = (8 / (math.pi ** 2)) * arctan * ((w1 - w_temp) * h1 / nw / nh)`

where `nw` = image width and `nh` = image height. I don't know if this is correct, but the CIoU loss does converge now. After training for ten epochs on the COCO dataset, the CIoU loss decreases to 0.30 with mAP = 0.211, while with the same model architecture and training pipeline the DIoU loss decreases to 0.26 with mAP = 0.212. For comparison, the original GIoU loss decreases to 0.26 with mAP = 0.215.

quantumsquirrel commented 3 years ago

> I didn't normalize the bbox coordinates from the first place, instead just change the one line of code: ar = (8 / (math.pi ** 2)) * arctan * ((w1 - w_temp) * h1 / nw / nh) where nw=image width and nh=image height I don't know if this is correct, but the ciou loss do converge now, after training for ten epochs on coco dataset, ciou loss decrease to 0.30, and mAP=0.211, while with the same model architecture and training pipeline, diou loss can decrease to 0.26 with mAP=0.212; for comparison to the original giou loss, it can decrease to 0.26 with mAP=0.215

Could you please tell me how I can get `nw` and `nh` when calculating the IoU loss? I am also trying to implement CIoU in FCOS.

Zzh-tju commented 3 years ago

@quantumsquirrel https://github.com/Zzh-tju/CIoU/blob/master/layers/modules/multibox_loss.py#L11 You need to transform (t, b, l, r) to (x, y, w, h), and then use the CIoU loss.
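The transformation described above can be sketched as follows. In FCOS, each location `(px, py)` regresses its distances to the four box sides; the names below are illustrative, not the repo's API:

```python
def ltrb_to_xywh(px, py, l, t, r, b):
    """Convert FCOS-style (left, top, right, bottom) distances from a
    feature-map location (px, py) into a center-size (x, y, w, h) box,
    the form a CIoU loss over centers and sizes expects."""
    x1, y1 = px - l, py - t          # top-left corner
    x2, y2 = px + r, py + b          # bottom-right corner
    w, h = x2 - x1, y2 - y1          # box size
    cx, cy = x1 + w / 2, y1 + h / 2  # box center
    return cx, cy, w, h
```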