YU1ut / openset-DA

Code for 'Open Set Domain Adaptation by Backpropagation'

question about the setting of `lambd` (constant) for `grad_reverse` #7


fuyimin96 commented 2 years ago

In the official code of OSBP, the constant (p) is set to 1 and the gradient is simply multiplied by -1 to reverse it, but in your code the constant seems to converge to 1 over training, via lines 88-89 in train.py:

```python
p = global_step / total_steps
constant = 2. / (1. + np.exp(-10 * p)) - 1
```

What is the reason for this operation, and what is the meaning of `grl-rampup-epochs`? Thanks for your reply!

YU1ut commented 2 years ago

It is a common operation when using gradient reversal. See Eq. 14 in the following paper. https://arxiv.org/pdf/1409.7495.pdf
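For reference, here is a minimal sketch of a gradient reversal layer combined with that ramp-up schedule, written in PyTorch; the names `GradReverse` and `grl_constant` are illustrative, not necessarily what this repo uses:

```python
import numpy as np
import torch

class GradReverse(torch.autograd.Function):
    # Identity in the forward pass; multiplies the incoming gradient
    # by -constant in the backward pass (gradient reversal).
    @staticmethod
    def forward(ctx, x, constant):
        ctx.constant = constant
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reversed, scaled gradient for x; None for the constant argument.
        return grad_output.neg() * ctx.constant, None

def grad_reverse(x, constant=1.0):
    return GradReverse.apply(x, constant)

def grl_constant(global_step, total_steps, gamma=10.0):
    # Eq. 14 of Ganin & Lempitsky: lambda_p = 2 / (1 + exp(-gamma * p)) - 1,
    # ramping smoothly from 0 toward 1 as training progress p goes 0 -> 1.
    p = global_step / total_steps
    return 2. / (1. + np.exp(-gamma * p)) - 1.
```

The idea behind the schedule is that the domain classifier's signal is noisy early in training, so the reversed gradient is suppressed at the start and only approaches full strength (constant ≈ 1) as training progresses.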

fuyimin96 commented 2 years ago

> It is a common operation when using gradient reversal. See Eq. 14 in the following paper. https://arxiv.org/pdf/1409.7495.pdf

Thanks for your reply.