Hi,
I was trying to plot the gradient norm and weight clipping for each layer as in the paper "Improved Training of Wasserstein GANs". But I am stuck on how to take the gradient norm of each layer.
Could someone give me an idea of how this can be done?
Thank you
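In case it helps anyone with the same question: one way to sketch this in PyTorch. After `loss.backward()`, each parameter's `.grad` holds its gradient, so the per-layer gradient norm is just the norm of that tensor, available via `named_parameters()`. This is a minimal sketch with a toy critic and dummy data (the model, sizes, and clip value are illustrative assumptions, not from the paper's code):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy critic standing in for the WGAN discriminator (illustrative only).
critic = nn.Sequential(
    nn.Linear(8, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
)

# Dummy batch and a stand-in loss; in WGAN this would be the critic loss.
x = torch.randn(4, 8)
loss = critic(x).mean()
loss.backward()

# Per-layer gradient norms: after backward(), p.grad holds the gradient
# of the loss w.r.t. that parameter, and .norm(2) gives its L2 norm.
grad_norms = {}
for name, param in critic.named_parameters():
    if param.grad is not None:
        grad_norms[name] = param.grad.norm(2).item()

for name, norm in grad_norms.items():
    print(f"{name}: grad L2 norm = {norm:.4f}")

# Weight clipping as in the original WGAN (the "Improved Training" paper
# replaces this with a gradient penalty); 0.01 is the commonly used value.
clip_value = 0.01
for p in critic.parameters():
    p.data.clamp_(-clip_value, clip_value)
```

You can log `grad_norms` (e.g. to TensorBoard) every few iterations to reproduce the per-layer gradient-norm plots from the paper.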