No, cplxmodule does not use torch's complex dtype. In issue #2 I outline the mathematics behind the grads computed in cplxmodule. Briefly, we emulate complex arithmetic and differentiation on split real and imaginary parts, and compute the final dL / d conj(z) for a real-valued loss w.r.t. a parameter z, which can be used seamlessly by the standard optimizers in torch.optim.*. Internally the computation is performed using way 1 from here: Wirtinger calculus is convenient for computing the grads by hand, but the split approach lets us leverage the existing real-valued autograd.
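To make this concrete, here is a minimal sketch of the split approach for the loss L = |z|^2 (illustrative only, not cplxmodule's actual code): the real and imaginary parts live in separate real tensors, plain real-valued autograd produces dL/dx and dL/dy, and that pair encodes dL / d conj(z).

```python
import torch

# Minimal sketch of the "split" approach for L = |z|^2 with z = x + i*y.
# Illustrative only; not cplxmodule's actual implementation.
x = torch.tensor(1.0, requires_grad=True)  # Re(z)
y = torch.tensor(2.0, requires_grad=True)  # Im(z)

L = x**2 + y**2  # |z|^2, a real-valued loss
L.backward()     # plain real-valued autograd

# The pair (dL/dx, dL/dy) is what torch.optim.* consumes componentwise.
# Packed as a complex number, it relates to the Wirtinger derivative by
#   dL/dx + i*dL/dy = 2 * dL/d conj(z),
# since d/d conj(z) = (d/dx + i*d/dy) / 2.
g = torch.complex(x.grad, y.grad)
print(g)  # tensor(2.+4.j) = 2 * dL/d conj(z), where dL/d conj(z) = z
```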
Thanks for the answer @ivannz!
I recently read about PyTorch autograd supporting complex differentiation using Wirtinger (CR) calculus here for its torch.complex datatype. However, the same document also describes a different, split way of computing gradients. I have been using cplxmodule, but I'm not sure how gradients are calculated in the framework. Does cplxmodule use PyTorch's complex autograd differentiation to compute gradients for Cplx tensors?
In summary, the complex autograd gradient is given by:

$$\frac{\partial L}{\partial z^*} = \mathrm{grad\_output}^* \cdot \frac{\partial s}{\partial z^*} + \mathrm{grad\_output} \cdot \left(\frac{\partial s}{\partial z}\right)^*$$

where:

- $s$ is the output of the forward computation,
- $\mathrm{grad\_output} = \partial L / \partial s^*$ is the incoming gradient passed to backward(),
- $\partial/\partial z$ and $\partial/\partial z^*$ are the Wirtinger derivatives $\tfrac{1}{2}\left(\partial/\partial x \mp i\,\partial/\partial y\right)$.

Hence, the entire eqn. reduces to the following eqn., since the grad_output seeded by backward() = 1 for a scalar output:

$$\frac{\partial L}{\partial z^*} = \frac{\partial s}{\partial z^*} + \left(\frac{\partial s}{\partial z}\right)^*$$
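As a quick numerical sanity check of the reduced equation (my own example, not from the docs): for L = |z|^2 = z * conj(z), the Wirtinger derivatives are dL/dz = conj(z) and dL/d conj(z) = z, so the formula predicts z.grad = z + conj(conj(z)) = 2z.

```python
import torch

# Verify the reduced formula dL/dz* + (dL/dz)* for L = |z|^2 = z * conj(z),
# where dL/dz = conj(z) and dL/dz* = z, so the predicted grad is 2*z.
z = torch.tensor(1.0 + 2.0j, requires_grad=True)
L = (z * z.conj()).real  # real-valued scalar loss
L.backward()             # seeds grad_output = 1

expected = 2 * z.detach()                # z + (conj(z))* = 2z
print(z.grad)                            # tensor(2.+4.j)
print(torch.allclose(z.grad, expected))  # True
```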