sniklaus / pytorch-liteflownet

a reimplementation of LiteFlowNet in PyTorch that matches the official Caffe version
GNU General Public License v3.0
412 stars · 80 forks

Question on flow normalizing #13

Closed ghost closed 5 years ago

ghost commented 5 years ago

Hello, regarding issue #11: is there a reason for scaling by ((tensorInput.size(3) - 1.0) / 2.0) instead of (tensorInput.size(3) - 1.0)? Does that line assume the input tensorFlow stays in [-(w-1)/2, (w-1)/2], so that the result lands in [-1, 1]?


sniklaus commented 5 years ago

Thank you for critically examining my code! Please note that the estimated tensorFlow itself can have negative values; the division only scales tensorFlow in accordance with the image size. Try your suggested version, execute the provided run.py, and examine the result: it will probably have little to no meaning anymore.
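For context, the normalization under discussion can be sketched roughly as follows. This is a sketch, not verbatim code from run.py: the names are illustrative, and it assumes the align_corners=True behavior that matches older PyTorch defaults.

```python
import torch
import torch.nn.functional as F

def backwarp(tensor_input, tensor_flow):
    # Sketch of the warping scheme discussed in this thread.
    n, _, h, w = tensor_input.shape

    # Precomputed identity sampling grid in grid_sample coordinates,
    # i.e. x and y each span [-1, 1].
    grid_x = torch.linspace(-1.0, 1.0, w).view(1, 1, 1, w).expand(n, 1, h, w)
    grid_y = torch.linspace(-1.0, 1.0, h).view(1, 1, h, 1).expand(n, 1, h, w)
    grid = torch.cat([grid_x, grid_y], dim=1)

    # The flow is in pixel units; one grid unit equals (size - 1) / 2 pixels,
    # hence the division in question.
    flow = torch.cat([
        tensor_flow[:, 0:1, :, :] / ((w - 1.0) / 2.0),
        tensor_flow[:, 1:2, :, :] / ((h - 1.0) / 2.0),
    ], dim=1)

    return F.grid_sample(tensor_input, (grid + flow).permute(0, 2, 3, 1),
                         mode='bilinear', padding_mode='zeros',
                         align_corners=True)
```

With zero flow this reproduces the input exactly; a flow of one pixel in x samples each pixel from its right neighbor.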

arc130 commented 5 years ago

I think so. You can check the documentation of PyTorch's grid_sample.

ghost commented 5 years ago

Hi @arc130. Yes, I checked grid_sample, and its documentation says the grid values should mostly be in [-1, 1].

What I could not find a description of was why the input flow is assumed to have the range [-(w-1)/2, (w-1)/2] rather than [-w, w], which leads to normalization by ((tensorInput.size(3) - 1.0) / 2.0) instead of (tensorInput.size(3) - 1.0).

arc130 commented 5 years ago

That is why I submitted issue #11. I think Backward_tensorGrid holds the coordinates of a grid with the origin in the upper-left corner of the image, and the optical flow represents the offset at each position. So Backward_tensorGrid + tensorFlow -> grid would be a grid in the range [0, h] / [0, w], and 2 * (grid / (h, w)) - 1 would scale that grid into [-1, 1]. The warp function in PWC-Net matches this view.
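For reference, the pixel-coordinate formulation described above can be sketched roughly like this. Note that the actual PWC-Net warp rescales by w - 1 and h - 1 rather than w and h (which is what makes it agree with the normalization discussed here); the names below are illustrative, not copied from PWC-Net.

```python
import torch
import torch.nn.functional as F

def backwarp_pixel_grid(tensor_input, tensor_flow):
    # Sketch of the pixel-coordinate warp formulation.
    n, _, h, w = tensor_input.shape

    # Identity grid in pixel coordinates, origin at the upper-left corner.
    grid_y, grid_x = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                                    torch.arange(w, dtype=torch.float32),
                                    indexing='ij')
    grid = torch.stack([grid_x, grid_y]).unsqueeze(0).expand(n, 2, h, w)

    # Add the flow (pixel offsets), then rescale [0, w-1] x [0, h-1] to [-1, 1].
    vgrid = grid + tensor_flow
    vgrid_x = 2.0 * vgrid[:, 0, :, :] / max(w - 1, 1) - 1.0
    vgrid_y = 2.0 * vgrid[:, 1, :, :] / max(h - 1, 1) - 1.0

    return F.grid_sample(tensor_input, torch.stack([vgrid_x, vgrid_y], dim=3),
                         mode='bilinear', padding_mode='zeros',
                         align_corners=True)
```

With zero flow this is again an identity warp, just like the precomputed [-1, 1] grid version.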

sniklaus commented 5 years ago

Thank you for chiming in @arc130! In other words, sampling the leftmost pixel from the rightmost pixel would require the corresponding grid value in grid_sample to be 1, since the range of grid is [-1, 1]. The precomputed Backward_tensorGrid is a sampling grid whose leftmost value is -1, though, so we have to add 2 to it. My apologies for not being able to explain this in more detail, but I am unfortunately quite busy. Other examples of where I have successfully done this (since 2017): pytorch-spynet, pytorch-pwc
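To spell the arithmetic out: the grid spans a length of 2 (from -1 to 1) across w - 1 pixel steps, so one pixel equals 2 / (w - 1) grid units, and dividing a pixel-valued flow by (w - 1) / 2 performs exactly that unit conversion. A tiny check with a hypothetical width:

```python
w = 5  # hypothetical image width
pixel_step_in_grid_units = 2.0 / (w - 1)  # the grid spans [-1, 1] over w - 1 steps

flow_in_pixels = 3.0
print(flow_in_pixels / ((w - 1.0) / 2.0))         # 1.5
print(flow_in_pixels * pixel_step_in_grid_units)  # 1.5, the same conversion
```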

arc130 commented 5 years ago

Oh, thanks for your reply @sniklaus! I found it was my mistake: Backward_tensorGrid is already scaled to the range [-1, 1], so both formulations give the same results.
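The agreement is easy to verify algebraically: adding a flow normalized by (w - 1)/2 to a [-1, 1] grid gives the same sampling coordinate as adding the raw pixel flow to a pixel grid and rescaling afterwards. A self-contained one-dimensional check:

```python
import torch

w = 7
x = torch.arange(w, dtype=torch.float32)  # pixel coordinates 0 .. w-1
flow = torch.randn(w)                     # horizontal flow in pixel units

# Variant A: precomputed [-1, 1] grid plus flow scaled by (w - 1) / 2.
grid_a = (2.0 * x / (w - 1) - 1.0) + flow / ((w - 1.0) / 2.0)

# Variant B: pixel grid plus raw flow, rescaled to [-1, 1] afterwards.
grid_b = 2.0 * (x + flow) / (w - 1) - 1.0

print(torch.allclose(grid_a, grid_b, atol=1e-5))  # True
```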

ghost commented 5 years ago

Thanks for the reply :)


Closed #13 https://github.com/sniklaus/pytorch-liteflownet/issues/13.
