Closed: DecaYale closed this issue 2 years ago
Hi, I noticed that you add some post-processing in the forward() function of LiteFlowNet:
```python
for i in flows:
    flows[i] = flows[i] * (20.0 * (0.5 ** (i-1)))
    # _, _, out_h, out_w = flows[i].shape
    # flows[i] = torch.nn.functional.interpolate(input=flows[i], size=(raw_h, raw_w), mode='bilinear', align_corners=False)
    # flows[i][:, 0, :, :] *= float(raw_w) / float(out_w)
    # flows[i][:, 1, :, :] *= float(raw_h) / float(out_h)
```
Why is this necessary? 20 seems like a strange number, and I'm a bit confused.
Sorry for the late reply. It is taken from this re-implementation: https://github.com/sniklaus/pytorch-liteflownet
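In case it helps later readers, here is a minimal sketch of how that per-level scaling and the commented-out resizing from the quoted snippet could be combined. The function name `postprocess_flows`, the assumption that `flows` is a list of `(N, 2, h, w)` tensors, and the 1-based level indexing are mine for illustration; they are not the repo's actual code.

```python
import torch.nn.functional as F

def postprocess_flows(flows, raw_h, raw_w):
    """Hypothetical helper: rescale each pyramid-level flow and upsample it
    to the original image size (raw_h, raw_w)."""
    out = []
    for i, flow in enumerate(flows, start=1):          # flow: (N, 2, h_i, w_i), level index assumed 1-based
        # Per-level scale factor taken from the quoted snippet: 20.0 * 0.5**(i-1)
        flow = flow * (20.0 * (0.5 ** (i - 1)))
        _, _, out_h, out_w = flow.shape                 # resolution before upsampling
        flow = F.interpolate(flow, size=(raw_h, raw_w),
                             mode='bilinear', align_corners=False)
        # Flow vectors are in pixels, so they must be rescaled by the resize ratio
        flow[:, 0] *= float(raw_w) / float(out_w)       # x-displacements
        flow[:, 1] *= float(raw_h) / float(out_h)       # y-displacements
        out.append(flow)
    return out
```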