jonbarron / robust_loss_pytorch

A pytorch port of google-research/google-research/robust_loss/
Apache License 2.0

Shape calculation bug in the AdaptiveImageLossFunction wrapper #14

Closed: rvarun7777 closed this issue 4 years ago

rvarun7777 commented 4 years ago

https://github.com/jonbarron/robust_loss_pytorch/blob/9831f1db8006105fe7a383312fba0e8bd975e7f6/robust_loss_pytorch/adaptive.py#L319

As written, the user has to permute their images before passing them to the function. This can be avoided by changing the line to `_, num_channels, height, width = x.shape`. The confusion probably arose from the difference between TensorFlow's and PyTorch's tensor layouts (NHWC vs. NCHW); a minimal sketch follows below.
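For illustration, here is a minimal sketch of the layout mismatch. The unpacking order on the linked line is assumed from the suggested fix above:

```python
import torch

# PyTorch image batches are conventionally NCHW: (batch, channels, height, width).
x = torch.zeros(8, 3, 64, 128)

# The wrapper unpacks the shape in TF-style NHWC order (assumed from the
# linked line), which mislabels every dimension of an NCHW tensor:
_, width, height, num_channels = x.shape
print(num_channels)  # 128, but the tensor has only 3 channels

# Proposed fix for NCHW input:
_, num_channels, height, width = x.shape
print(num_channels)  # 3, as intended
```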

jonbarron commented 4 years ago

Oh interesting, I didn't realize that PyTorch had a history of using an NCHW memory layout. Too bad there isn't a TF-like way to specify an arbitrary memory layout.

Is there any reason to prefer one layout over the other besides convention?

rvarun7777 commented 4 years ago

Not really; it's simply the convention PyTorch chose, and I'm fine with it. I raised the issue because other users might not spot the bug at this location: the code would instead fail later, at https://github.com/jonbarron/robust_loss_pytorch/blob/9831f1db8006105fe7a383312fba0e8bd975e7f6/robust_loss_pytorch/wavelet.py#L325, and they would have to trace that failure back to this line. A possible interim workaround is sketched below.
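Until the fix lands, one workaround is to permute NCHW images into the TF-style layout the wrapper currently unpacks. This is only a sketch: the constructor arguments follow the style of the repo's README, and `image_size` is assumed to be `(width, height, channels)`; both are assumptions, not confirmed API:

```python
import numpy as np
import torch
from robust_loss_pytorch import adaptive

# A hypothetical NCHW image batch, following the PyTorch convention.
x = torch.randn(8, 3, 64, 64)  # (batch, channels, height, width)

# Constructor arguments assumed from the repo's README; image_size is
# assumed to be (width, height, channels) to match the TF-style layout.
lossfun = adaptive.AdaptiveImageLossFunction(
    image_size=(64, 64, 3), float_dtype=np.float32, device='cpu:0')

# Reorder NCHW -> (batch, width, height, channels) before calling lossfun,
# so the shape unpacking at adaptive.py#L319 sees the layout it expects.
loss = lossfun.lossfun(x.permute(0, 3, 2, 1))
```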

jonbarron commented 4 years ago

Oh good, at least there's something of a check on this. I'll keep this issue in mind in case I find some time to work on this codebase, but if you're interested in making the change yourself I'd be happy to merge the PR.

rvarun7777 commented 4 years ago

Sure, I will open a PR with the fix and verify it, and you can review and merge it.