Open life97 opened 3 years ago
The gradient vanished, which means the value of your loss function is too tiny. You need to balance the relative scale of each loss term; it takes some experimentation. Please scale each loss term to a consistent order of magnitude!
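Besides loss scaling, NaN in a DIoU penalty usually comes from an unguarded division (zero union or zero enclosing-box diagonal). Below is a minimal, hypothetical sketch of the DIoU loss with `eps`-guarded divisions; the function name, box layout `[x1, y1, x2, y2]`, and `eps` value are illustrative assumptions, not code from this repo:

```python
def diou_loss(b1, b2, eps=1e-7):
    # Hypothetical eps-guarded DIoU loss sketch; boxes are [x1, y1, x2, y2].
    # Intersection area, clamped at 0 so disjoint boxes don't go negative.
    iw = max(0.0, min(b1[2], b2[2]) - max(b1[0], b2[0]))
    ih = max(0.0, min(b1[3], b2[3]) - max(b1[1], b2[1]))
    inter = iw * ih

    area1 = (b1[2] - b1[0]) * (b1[3] - b1[1])
    area2 = (b2[2] - b2[0]) * (b2[3] - b2[1])
    union = area1 + area2 - inter
    iou = inter / (union + eps)  # eps guards degenerate (zero-area) boxes

    # Squared distance between the two box centers.
    cx1, cy1 = (b1[0] + b1[2]) / 2, (b1[1] + b1[3]) / 2
    cx2, cy2 = (b2[0] + b2[2]) / 2, (b2[1] + b2[3]) / 2
    rho2 = (cx1 - cx2) ** 2 + (cy1 - cy2) ** 2

    # Squared diagonal of the smallest enclosing box.
    cw = max(b1[2], b2[2]) - min(b1[0], b2[0])
    ch = max(b1[3], b2[3]) - min(b1[1], b2[1])
    c2 = cw ** 2 + ch ** 2

    # DIoU penalty rho^2 / c^2 added to the IoU loss; eps prevents 0 / 0.
    return 1.0 - iou + rho2 / (c2 + eps)
```

For identical boxes the loss is ~0; for disjoint boxes it exceeds 1 because the center-distance penalty is added on top of `1 - IoU`. In a real training loop the same clamping should be applied wherever predicted coordinates can collapse to zero width or height.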
Thank you very much for this interesting work. I added DIoU to the IoU loss as a penalty term, but the network loss became NaN during training. Any idea why?
I also met this situation. Did you solve it?