jcjohnson / densecap

Dense image captioning in Torch
MIT License

`total_loss` is greater than the sum of the losses #32

Closed agude closed 8 years ago

agude commented 8 years ago

`total_loss` is often larger than the sum of all the losses. For example, here is the output from a training run:

```
Loss stats:
{
  mid_box_reg_loss : 0.0019056373813539
  captioning_loss : 5.0455059438944
  end_box_reg_loss : 0.002998169460654
  end_objectness_loss : 0.038921515512466
  total_loss : 10.398667634428
  mid_objectness_loss : 0.11150163569537
}
Average loss: 10.398667634428
```

Note that 10.398... is not the sum of the other losses in the `losses` table; it is roughly double that sum, which points to the total being counted twice. I think there is a bug here: https://github.com/jcjohnson/densecap/blob/master/densecap/DenseCapModel.lua#L455-L458
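
To illustrate, summing the printed losses by hand is consistent with the total being counted twice (purely illustrative arithmetic using the values above):

```lua
-- Quick check using the printed values: the five individual losses sum to
-- roughly 5.20, and doubling that gives roughly 10.40, close to the
-- reported total_loss of 10.3987.
local individual = 0.0019056 + 5.0455059 + 0.0029982 + 0.0389215 + 0.1115016
print(individual)      -- ~5.2008
print(2 * individual)  -- ~10.4017
```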

`total_loss` is added to the `losses` table, the table is then iterated over, and every value in it is added to the `total_loss` entry. Because `total_loss` is itself one of those values, at some point in that for loop the update effectively becomes `total_loss = total_loss + total_loss`.
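
Here is a minimal sketch of that pattern, with hypothetical values rather than the verbatim code from `DenseCapModel.lua`:

```lua
-- Minimal sketch of the accumulation pattern described above
-- (hypothetical values, not the actual code from DenseCapModel.lua).
local losses = {
  captioning_loss = 5.0455,
  mid_objectness_loss = 0.1115,
  mid_box_reg_loss = 0.0019,
  end_objectness_loss = 0.0389,
  end_box_reg_loss = 0.0030,
}

-- total_loss is inserted into the same table that is about to be summed.
losses.total_loss = 0
for k, v in pairs(losses) do
  -- When pairs() reaches the 'total_loss' key, v is whatever has been
  -- accumulated so far, so the running sum gets added to itself.
  losses.total_loss = losses.total_loss + v
end

-- Depending on the iteration order of pairs(), this prints up to roughly
-- twice the true sum of the individual losses.
print(losses.total_loss)
```

If the iteration happens to visit the `total_loss` key after the other keys, the result is roughly double the true total, which matches the output above. Computing the sum before inserting the `total_loss` entry, or skipping that key inside the loop, would avoid the self-addition.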