SakurajimaMaiii / TSD

[CVPR 2023] Feature Alignment and Uniformity for Test Time Adaptation
https://arxiv.org/abs/2303.10902
MIT License

optimizer = torch.optim.Adam(algorithm.parameters(),lr=args.lr) #11

Closed jis3613 closed 11 months ago

jis3613 commented 11 months ago

I think there is an error in the code, at unsupervised_adapt.py line 198:

optimizer = torch.optim.Adam(algorithm.parameters(),lr=args.lr)

Because of this line, the model always updates all of its parameters, regardless of which `update_param` option is chosen. I think line 198 should be commented out. I could be wrong, but I wanted to ask you to check whether this is right.

Thank you for sharing your work.
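To illustrate the issue: a minimal sketch (assuming PyTorch; the model and the norm-layer selection here are purely illustrative, not the repository's actual code) showing why constructing an optimizer over `model.parameters()` silently discards any earlier parameter selection.

```python
import torch
import torch.nn as nn

# Hypothetical model standing in for `algorithm`; layer sizes are arbitrary.
model = nn.Sequential(nn.Linear(4, 8), nn.BatchNorm1d(8), nn.Linear(8, 2))

# Respecting an "update_param"-style choice: collect only the parameters
# that should be adapted (here, the affine params of normalization layers).
selected = [p for m in model.modules()
            if isinstance(m, nn.BatchNorm1d)
            for p in m.parameters()]
opt_selected = torch.optim.Adam(selected, lr=1e-3)

# The problematic pattern from the issue: this optimizer receives *all*
# parameters, so the selection above no longer restricts what is updated.
opt_all = torch.optim.Adam(model.parameters(), lr=1e-3)

n_selected = sum(p.numel() for g in opt_selected.param_groups for p in g["params"])
n_all = sum(p.numel() for g in opt_all.param_groups for p in g["params"])
print(n_selected, n_all)  # the second optimizer covers far more parameters
```

Whichever optimizer is constructed last (and then used in the adaptation loop) determines what actually gets updated, which is why the later line 198 overrides the `update_param` choice.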

SakurajimaMaiii commented 11 months ago

I think what you said is right. Thanks for pointing this out. I will fix it as soon as possible.