Hi,

In line https://github.com/PotatoTian/TPGM/blob/82f0eb0976b230b9b777e4910dfd0c37fc4186e6/DomainNet_ResNet_Exp/main_finetune.py#L108, the parameter names change from `head.bias` to `module.head.bias`. As a result, TPGM parameters will also be learned for the final layer's weights and biases. Is this intentional?
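For context, this rename is standard PyTorch behavior: wrapping a model in `nn.DataParallel` (which I assume is what line 108 does) stores it under a `module` attribute, so every parameter name gains a `module.` prefix. A minimal sketch:

```python
import torch.nn as nn

# Toy model with a "head" layer, standing in for the ResNet used in the repo.
model = nn.Sequential()
model.add_module("head", nn.Linear(512, 10))

print([name for name, _ in model.named_parameters()])
# ['head.weight', 'head.bias']

# Wrapping in DataParallel nests the model under .module,
# which prefixes every parameter name with "module.".
wrapped = nn.DataParallel(model)
print([name for name, _ in wrapped.named_parameters()])
# ['module.head.weight', 'module.head.bias']
```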
Hi,

Thanks for pointing this out. This is not intentional; the head weights are supposed to be excluded from the TPGM projection because they are randomly initialized. Please change the exclude list at https://github.com/PotatoTian/TPGM/blob/82f0eb0976b230b9b777e4910dfd0c37fc4186e6/DomainNet_ResNet_Exp/main_finetune.py#L127 to `exclude_list=["module.head.weight", "module.head.bias"]`. I will also update it shortly.

Best,
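To illustrate why the old names silently fail, here is a minimal sketch (not the repo's actual code) assuming the exclude list is compared against parameter names by exact string match:

```python
# Hypothetical parameter names after DataParallel wrapping.
param_names = [
    "module.head.weight",
    "module.head.bias",
    "module.layer4.1.conv2.weight",
]

# With the old, unprefixed names nothing matches, so the head
# would still be constrained by TPGM along with everything else.
exclude_list = ["head.weight", "head.bias"]
print([n for n in param_names if n not in exclude_list])
# ['module.head.weight', 'module.head.bias', 'module.layer4.1.conv2.weight']

# With the corrected, prefixed names the head is properly skipped.
exclude_list = ["module.head.weight", "module.head.bias"]
print([n for n in param_names if n not in exclude_list])
# ['module.layer4.1.conv2.weight']
```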