egmaminta closed this issue 1 week ago
Sorry for the confusion. reg is the regularization term that gets multiplied by the lambda coefficient; it can include L1, entropy, and other penalties.
Oh I see... if reg is the regularization term, shouldn't it be zero when I set the lambda coefficient (the coefficient controlling the overall regularization magnitude) to zero as well?
Oh sorry, the displayed value is before applying the coefficient lamb, i.e., real_reg = lamb * reg and total_loss = pred_loss + real_reg.
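A minimal NumPy sketch of the relationship described above. The specific penalty terms (l1_term, entropy_term) and all values here are illustrative placeholders, not pykan's actual implementation; the point is only that the logged reg is computed before scaling by lamb, so it stays nonzero and keeps changing even when lamb = 0.

```python
import numpy as np

def l1_term(acts):
    """Mean absolute activation -- a stand-in for an L1 penalty."""
    return np.mean(np.abs(acts))

def entropy_term(acts):
    """Entropy of the normalized absolute activations."""
    p = np.abs(acts) / (np.sum(np.abs(acts)) + 1e-12)
    return -np.sum(p * np.log(p + 1e-12))

acts = np.array([0.5, -1.0, 2.0, 0.1])  # placeholder activations
pred_loss = 0.25                        # placeholder prediction loss
lamb = 0.0                              # overall regularization magnitude

reg = l1_term(acts) + entropy_term(acts)  # this is the value shown in the log
real_reg = lamb * reg                     # penalty actually added to the loss
total_loss = pred_loss + real_reg

# With lamb = 0: reg is still nonzero (computed before scaling),
# but total_loss equals pred_loss.
```

So setting lamb to zero removes the penalty from total_loss but does not make the printed reg zero, which matches the clarification above.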
Hi! I would like to ask what reg means when training a KAN. For example, you would see something like: train loss: xxx | test loss: yyy | reg: zzz | ...
And why is it always changing? Does it refer to lambda, the overall regularization parameter?