wavefrontshaping / complexPyTorch

A high-level toolbox for using complex valued neural networks in PyTorch
MIT License

Memory Leak Issue: ComplexBatchNorm2d #1

Closed ws-choi closed 2 years ago

ws-choi commented 5 years ago

Hello. I'm implementing a Deep Complex NN based model, and thanks to your library I was able to put it together quickly.

However, I think I found a possible improvement in your code: ComplexBatchNorm2d (and possibly the 1d version, which I didn't check) seems to cause a memory leak.

The leak is not immediately critical: RAM usage grows only slightly after each epoch. Eventually, though, the Python kernel dies once there is no memory left.

You might want to see this link.

After a small modification, it works: I just added some .detach() calls in your code. I'm not sure this is theoretically correct, but the memory leak is gone after the modification. I'll send you a pull request!
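To illustrate the kind of leak described above, here is a minimal sketch (not the actual complexPyTorch code; the class and attribute names are hypothetical). Updating a running statistic with a tensor that is still attached to the autograd graph keeps every batch's graph alive, so memory grows with each iteration; calling `.detach()` on the batch statistic breaks that chain:

```python
import torch
import torch.nn as nn

class RunningMeanExample(nn.Module):
    """Hypothetical sketch of a batch-norm-style running mean update.

    Without .detach(), each assignment to running_mean would chain the
    current batch's computation graph onto all previous ones, leaking
    memory across iterations.
    """

    def __init__(self, num_features, momentum=0.1):
        super().__init__()
        self.register_buffer('running_mean', torch.zeros(num_features))
        self.momentum = momentum

    def forward(self, x):
        if self.training:
            batch_mean = x.mean(dim=0)
            # Leaky version (for illustration): batch_mean still carries
            # the autograd graph of x, so the buffer would accumulate
            # graphs across batches.
            #   self.running_mean = (1 - self.momentum) * self.running_mean \
            #                       + self.momentum * batch_mean
            # Fixed version: detach so the buffer stores plain values only.
            self.running_mean = (1 - self.momentum) * self.running_mean \
                                + self.momentum * batch_mean.detach()
            mean = batch_mean
        else:
            mean = self.running_mean
        return x - mean
```

After a training step, `running_mean` carries no `grad_fn`, so no graph is retained between batches.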

Thank you.

wavefrontshaping commented 4 years ago

Indeed, I am aware of the problem and I discussed it here: https://discuss.pytorch.org/t/how-does-batchnorm-keeps-track-of-running-mean/40084
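For reference, the pattern discussed in that forum thread can also be sketched with an in-place buffer update under `torch.no_grad()`, which is the approach PyTorch's own `nn.BatchNorm` layers use for their running statistics; the variable names here are illustrative:

```python
import torch

# Running statistic stored as a plain tensor (in a module this would be
# a registered buffer). Updating it in place inside torch.no_grad()
# keeps the update out of the autograd graph entirely.
running_mean = torch.zeros(3)
momentum = 0.1

x = torch.randn(8, 3, requires_grad=True)
with torch.no_grad():
    # running_mean <- (1 - momentum) * running_mean + momentum * batch_mean
    running_mean.mul_(1 - momentum).add_(x.mean(dim=0), alpha=momentum)
```

Because the update happens under `no_grad`, the buffer never references a computation graph, so no extra `.detach()` calls are needed.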

I fixed it a while ago but somehow rolled it back unintentionally, sorry about that.

I quickly changed the files; let me know if it works for you.