1. What is the meaning of alpha_bbox?
2. Why does c_loss need retain_graph? i.e. c_loss.backward(retain_graph=True)
3. What is the overall formula of the loss? I am really lost in the reshaped* stuff.
4. Why is backward called on each loss separately? Should we instead make a total_loss and call backward on the total_loss all in one go?
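To make question 4 concrete, here is a minimal toy sketch of what I mean (this is my own made-up model, not the repo's actual network, and treating alpha_bbox as a simple weight on the bbox term is my assumption). It compares the two-backward style, where retain_graph=True keeps the shared graph alive for the second call, against summing into one total_loss first:

```python
import torch

torch.manual_seed(0)

# Hypothetical minimal setup (NOT the repo's actual model): a shared
# trunk feeding a classification head and a bbox head, so both losses
# share part of the computation graph.
trunk = torch.nn.Linear(4, 8)
class_head = torch.nn.Linear(8, 3)
bbox_head = torch.nn.Linear(8, 4)
alpha_bbox = 1.0  # assumption: alpha_bbox just weights the bbox term

x = torch.randn(2, 4)

def losses():
    feats = trunk(x)
    return class_head(feats).pow(2).mean(), bbox_head(feats).pow(2).mean()

# Style A (as in the lab code): two backward calls. retain_graph=True on
# the first call keeps the shared graph's buffers alive so the second
# backward can traverse it again.
c_loss, bbox_loss = losses()
c_loss.backward(retain_graph=True)
(alpha_bbox * bbox_loss).backward()
grad_a = trunk.weight.grad.clone()

# Style B (what I am suggesting): sum the losses, backward once.
trunk.zero_grad()
class_head.zero_grad()
bbox_head.zero_grad()
c_loss, bbox_loss = losses()
(c_loss + alpha_bbox * bbox_loss).backward()
grad_b = trunk.weight.grad.clone()

# Gradients accumulate, so the two styles should give the same result.
print(torch.allclose(grad_a, grad_b))
```

If the two really are equivalent, is there a reason to prefer the two separate backward calls, e.g. logging the losses independently?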
Really good example, by the way. But I could not understand the loss definition; the code I am asking about is between https://github.com/jeremyfix/deeplearning-lectures/blob/a2535c6969b644914af2a47cd60aa4e6d02af1fc/LabsSolutions/01-pytorch-object-detection/utils.py#L181 and https://github.com/jeremyfix/deeplearning-lectures/blob/a2535c6969b644914af2a47cd60aa4e6d02af1fc/LabsSolutions/01-pytorch-object-detection/utils.py#L225
Thanks to all of you. I have really learned a lot from this.