Open busyyang opened 4 months ago
Hi, I just found an issue about the loss weights when `ofg_epoch=0`.

When doing OFG, `loss_ofg` and `loss_reg` are added **with** weights:
```python
if ofg_epoch:
    '''use ofg loss'''
    ofg_lr = calc_learning_rate(epoch, max_epoch, args.ofg_lr)
    criterion_ofg = OFGLoss(iter_count=ofg_epoch, reg_weight=weights_opt[1], lr=ofg_lr)
    loss_ofg = criterion_ofg(x, y, output[1]) * weights_model[0]  # <-------- with weights
    loss_reg = criterion_reg(output[1]) * weights_model[1]        # <-------- with weights
    loss = loss_ofg + loss_reg
    loss_vals = [loss_ofg, loss_reg]
```
But `loss_ncc` and `loss_reg` are added **without** weights:

```python
else:
    '''use ncc loss'''
    loss_ncc = criterion_ncc(output[0], y)  # <-------- without weights
    loss_reg = criterion_reg(output[1])     # <-------- without weights
    loss = loss_ncc + loss_reg
    loss_vals = [loss_ncc, loss_reg]
```
Is this a bug when OFG is not used?
No. The default weight for the unsupervised loss, i.e., NCC + regularization, is 1:1, so no weights are needed there. For OFG, we have OFG:Reg = 1:0.02, so weights are involved.
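To illustrate the answer, here is a minimal sketch of why the two branches are equivalent in form: the NCC branch's plain sum is just the special case where both weights are 1. The loss values and the `total_loss` helper below are hypothetical stand-ins, not the repository's actual API; only the 1:0.02 OFG:Reg ratio comes from the discussion above.

```python
def total_loss(sim, reg, weights=(1.0, 1.0)):
    """Combine a similarity loss and a regularization loss with weights.

    The default (1.0, 1.0) reproduces the unweighted NCC branch:
    loss = loss_ncc + loss_reg is identical to 1.0*sim + 1.0*reg.
    """
    return weights[0] * sim + weights[1] * reg

# Hypothetical example loss values for illustration only.
sim_val, reg_val = 0.5, 2.0

# OFG branch: OFG:Reg = 1:0.02, so explicit weights are applied.
ofg_total = total_loss(sim_val, reg_val, weights=(1.0, 0.02))  # 0.5 + 0.04 = 0.54

# NCC branch: default 1:1 weighting, equivalent to applying no weights at all.
ncc_total = total_loss(sim_val, reg_val)  # 0.5 + 2.0 = 2.5
```

So omitting the multipliers in the `else` branch is not a bug; it is the 1:1 case written without redundant factors.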