Open nikhilsab opened 3 years ago
Did you use the unrolled backward? I find there are no gradients w.r.t. the ops weights and probabilities if we don't use the unrolled version. In the author's implementation, the backward pass for updating the architecture parameters does not seem to be relaxed.
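To see why skipping the unrolled backward freezes the architecture parameters, here is a minimal toy sketch (not the DADA code; the losses are hypothetical). The outer (validation) loss depends on the architecture parameter only through the inner weight update, so a first-order approximation that treats the weights as constants yields exactly zero gradient, which matches the "probabilities never update" symptom:

```python
# Toy bilevel problem illustrating the unrolled backward.
# Inner loss  L_train(w, a) = (w - a)^2   (w: network weight, a: arch param)
# Outer loss  L_val(w)      = w^2         (depends on a only through w)

xi = 0.1       # inner-loop learning rate
w, a = 1.0, 0.5

# --- first-order (no unrolling): treat w as a constant ---
# L_val has no direct dependence on a, so its gradient w.r.t. a is zero.
grad_a_first_order = 0.0

# --- unrolled: differentiate through one inner SGD step ---
# w' = w - xi * dL_train/dw = w - xi * 2*(w - a)
w_prime = w - xi * 2.0 * (w - a)
# dw'/da = 2*xi, so by the chain rule dL_val/da = 2*w' * 2*xi
grad_a_unrolled = 2.0 * w_prime * 2.0 * xi

print(grad_a_first_order)  # 0.0     -> arch param frozen
print(grad_a_unrolled)     # 0.36    -> arch param receives a gradient
```

In DARTS-style search this is the second-order ("unrolled") update: the validation gradient is backpropagated through the simulated training step, which is what lets the probabilities and op weights move at all.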
@nikhilsab did you resolve this problem ?
Yes, thank you for reaching out. I was able to resolve it!
@nikhilsab can you tell me how to resolve it ? thank you!!!
Hello,
Apologies for the late reply. I think using the unrolled weights solved the problem; I had implemented it incorrectly.
As part of my personal research, I am studying various automated data augmentation techniques. While trying to reproduce your results, I ran into issues with the update of the probabilities and op_weights: only the magnitude is updated over epochs, while the probabilities and ops_weights remain constant throughout the run, at 0.5 and 0.0095 respectively. Could you kindly help me rectify this issue?
Thank you.