I think there is an error in unsupervised_adapt.py at line 198:

optimizer = torch.optim.Adam(algorithm.parameters(), lr=args.lr)

Because of this line, the model always updates all parameters, no matter which "update_param" option is chosen. I think line 198 should be commented out. I could be wrong, but I'd appreciate it if you could check whether this is right.
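To illustrate the point, here is a minimal sketch of how the optimizer could be built only over the subset selected by "update_param" instead of over `algorithm.parameters()`. The option names ("all", "bn") and the helper `build_optimizer` are my own assumptions for illustration, not the repository's actual API:

```python
import torch
import torch.nn as nn

def build_optimizer(model: nn.Module, update_param: str, lr: float = 1e-3):
    # Hypothetical sketch: pick the parameter subset before creating Adam,
    # so the choice of `update_param` actually takes effect.
    if update_param == "all":
        params = list(model.parameters())
    elif update_param == "bn":
        # Only the affine parameters of normalization layers
        # (Tent-style test-time adaptation).
        params = [p for m in model.modules()
                  if isinstance(m, (nn.BatchNorm1d, nn.BatchNorm2d))
                  for p in (m.weight, m.bias) if p is not None]
    else:
        raise ValueError(f"unknown update_param: {update_param}")
    return torch.optim.Adam(params, lr=lr)

model = nn.Sequential(nn.Linear(4, 8), nn.BatchNorm1d(8), nn.Linear(8, 2))
opt_bn = build_optimizer(model, "bn")
n_bn = sum(p.numel() for g in opt_bn.param_groups for p in g["params"])
print(n_bn)  # 16: weight + bias of the single BatchNorm1d(8) layer
```

With the line-198 version, `n_bn` would instead equal the full parameter count, which is the behavior being reported.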
Thank you for sharing your work.