JJBOY / BMN-Boundary-Matching-Network

A PyTorch implementation of the paper "BMN: Boundary-Matching Network for Temporal Action Proposal Generation", accepted at ICCV 2019.

pem loss nan? #23

Open leemengxing opened 4 years ago

leemengxing commented 4 years ago

Thanks for your work. But I found that sometimes, when the batch size is 1, the PEM loss becomes NaN. Why does this happen?

leemengxing commented 4 years ago

There is a statistical ratio in the loss function. For small batch sizes, such as 1, it may happen that the number of samples with IoU > 0.7 or IoU > 0.9 is 0, which leads to NaN. Is this correct?
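For context, here is a minimal sketch of that failure mode, simplified from the repo's PEM classification loss (thresholds and masking details are illustrative, not a verbatim copy of the repo's code):

```python
import torch

def pem_cls_loss(pred_score, gt_iou_map):
    # Positive samples: proposals whose ground-truth IoU exceeds 0.9.
    pmask = (gt_iou_map > 0.9).float()
    nmask = (gt_iou_map <= 0.9).float()
    num_positive = torch.sum(pmask)  # 0 when the batch has no high-IoU proposal
    num_entries = num_positive + torch.sum(nmask)

    # Class-balancing ratio: divides by num_positive, so an all-negative
    # batch gives ratio = inf and coef_0 = inf / inf = NaN.
    ratio = num_entries / num_positive
    coef_0 = 0.5 * ratio / (ratio - 1)
    coef_1 = 0.5 * ratio

    epsilon = 1e-6
    loss_pos = coef_1 * torch.log(pred_score + epsilon) * pmask
    loss_neg = coef_0 * torch.log(1.0 - pred_score + epsilon) * nmask
    return -torch.sum(loss_pos + loss_neg) / num_entries
```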

vhvkhoa commented 4 years ago

I have also met this problem and had to replace the loss with Focal Loss. I am still running experiments to make sure it works.
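A minimal sketch of such a replacement, using a standard binary focal loss (the `alpha`/`gamma` values and the 0.9 positive threshold are illustrative assumptions, not necessarily what was used here):

```python
import torch

def focal_pem_cls_loss(pred_score, gt_iou_map, alpha=0.25, gamma=2.0):
    # Binary focal loss: down-weights easy examples instead of re-weighting
    # by a positive/negative count ratio, so it stays finite even when a
    # batch contains no positive samples.
    target = (gt_iou_map > 0.9).float()
    epsilon = 1e-6
    p = pred_score.clamp(epsilon, 1.0 - epsilon)
    loss_pos = -alpha * (1 - p) ** gamma * torch.log(p) * target
    loss_neg = -(1 - alpha) * p ** gamma * torch.log(1 - p) * (1 - target)
    return torch.mean(loss_pos + loss_neg)
```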

leemengxing commented 4 years ago

Have you tried changing the binary classification into a multi-class classification, for detection tasks instead of proposal tasks? @vhvkhoa

JJBOY commented 4 years ago

> There is a statistical ratio in the loss function. For small batch sizes, such as 1, it may happen that the number of samples with IoU > 0.7 or IoU > 0.9 is 0, which leads to NaN. Is this correct?

Yes, you are right.

lyx190 commented 4 years ago

I have met this problem when the batch size was set to 16. Does anyone have ideas on how to deal with it?

longchao1 commented 4 years ago

I have met this problem when the batch size was set to 4. Does anyone have ideas on how to deal with it?

JJBOY commented 4 years ago

@lyx190 @longchao1
The reason the loss becomes NaN is explained by @leemengxing above. There are two simple ways to solve this problem:
1. Set a larger batch size.
2. When a batch has no positive samples because of the thresholds in the loss function, just skip that batch (see the sketch below).
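A minimal sketch of the second option as a guard in the training loop (`model`, `bmn_loss`, `train_loader`, `optimizer`, and the tensor names are illustrative assumptions, not the repo's exact API):

```python
import torch

# Hypothetical training loop with a skip-batch guard.
for input_data, label_start, label_end, gt_iou_map in train_loader:
    gt_iou_map = gt_iou_map.cuda()
    # Skip batches with no positive PEM samples: with the 0.9 threshold,
    # an all-negative batch makes the balancing ratio divide by zero.
    if torch.sum(gt_iou_map > 0.9) == 0:
        continue
    confidence_map, start, end = model(input_data.cuda())
    loss = bmn_loss(confidence_map, start, end,
                    label_start.cuda(), label_end.cuda(), gt_iou_map)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```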

lyx190 commented 4 years ago

> @lyx190 @longchao1 The reason the loss becomes NaN is explained by @leemengxing above. There are two simple ways to solve this problem: 1. Set a larger batch size. 2. When a batch has no positive samples because of the thresholds in the loss function, just skip that batch.

Thanks a lot