wavefrontshaping / complexPyTorch

A high-level toolbox for using complex-valued neural networks in PyTorch
MIT License

Use native PyTorch operations for complex numbers #19

Open anjali411 opened 2 years ago

anjali411 commented 2 years ago

Hi, I noticed that you have a custom matmul (https://github.com/wavefrontshaping/complexPyTorch/blob/a4e752caf827f3b642960366a7e9420f308076cc/complexPyTorch/complexFunctions.py#L11-L19) as well as custom tanh and neg functions (https://github.com/wavefrontshaping/complexPyTorch/blob/a4e752caf827f3b642960366a7e9420f308076cc/complexPyTorch/complexFunctions.py#L52-L56). These are actually unnecessary: the last couple of PyTorch releases support these functions for complex numbers natively, and the native versions would be much faster too (for matmul, for example, we call into a BLAS operation).
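For reference, a minimal sketch of the native calls (assuming PyTorch >= 1.8; the shapes here are arbitrary):

```python
import torch

a = torch.randn(4, 4, dtype=torch.complex64)
b = torch.randn(4, 4, dtype=torch.complex64)

c = torch.matmul(a, b)  # native complex matmul, dispatches to a BLAS routine
t = torch.tanh(a)       # elementwise complex tanh
n = torch.neg(a)        # native negation, same as -a
```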

wavefrontshaping commented 2 years ago

Hi. Before version 1.7, complex tensors were not implemented at all. Since then, various updates have introduced complex tensors, and now almost all operations are implemented; very recently, and very importantly, autograd works for complex tensors.
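A minimal sketch of what now works out of the box (assuming a recent PyTorch; the loss here is just an arbitrary example):

```python
import torch

z = torch.randn(3, dtype=torch.complex64, requires_grad=True)
loss = (z * z.conj()).real.sum()  # |z|^2 summed: a real-valued loss
loss.backward()                   # complex autograd (Wirtinger calculus)
print(z.grad)                     # complex-valued gradient
```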

So basically, complexPyTorch is now obsolete: you can use PyTorch out of the box with complex tensors just as you use real tensors, and everything should work. I keep this repository for backward compatibility with old code, or in case one needs to use an old version of PyTorch for some reason.
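For instance, on recent versions standard modules accept a complex dtype directly (a hedged sketch; nn.Linear is one example, not an exhaustive list):

```python
import torch
import torch.nn as nn

layer = nn.Linear(8, 4, dtype=torch.complex64)  # complex weights and bias
x = torch.randn(2, 8, dtype=torch.complex64)

y = layer(x)                 # complex affine transform
loss = y.abs().pow(2).sum()  # real-valued loss
loss.backward()              # fills layer.weight.grad with complex gradients
```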

SantaTitular commented 2 years ago

Hi @wavefrontshaping @anjali411 ,

Could you elaborate a bit more on how to use PyTorch out of the box with complex tensors? For instance, in the Keras CVNN library (similar to yours, but built on Keras/TensorFlow), the activation functions are well documented and explained. With PyTorch, however, I can't seem to find the code showing which implementations of the layers are available (#47052). For instance, with ReLU and tanh there are different possible implementations; how do I differentiate between them? To make the question concrete, see the sketch below.
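As far as I can tell (a sketch of my understanding, not an official answer): torch.tanh accepts a complex tensor directly since tanh has a standard complex extension, while torch.relu does not, since ReLU has no canonical complex definition, which is presumably why this library applies ReLU to the real and imaginary parts separately in its complex_relu:

```python
import torch

z = torch.randn(5, dtype=torch.complex64)

print(torch.tanh(z))  # works: tanh is defined for complex numbers

try:
    torch.relu(z)     # ReLU is not defined for complex inputs
except RuntimeError as err:
    print("relu rejects complex input:", err)

# The "split" ReLU applied by complexPyTorch's complex_relu:
split_relu = torch.complex(torch.relu(z.real), torch.relu(z.imag))
```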