Open MatthewInkawhich opened 6 years ago
Hi, I encountered the same issue too, and I fixed it by using this — hope it can help you!
Hi @PK15946 , thanks for the information. Do you want to make a pull request?
@MatthewInkawhich Hi, have you fixed this problem?
@PK15946 Hi~, thanks for the information, but I cannot open the link you provided. Do you remember how you fixed this problem?
@liuqk3 Hi, no I did not end up using this repo. The code that I was trying to run worked on https://github.com/NVIDIA/flownet2-pytorch. This repo is based on NVIDIA's implementation anyway.
@MatthewInkawhich I figured out the reason. There is a bug in the file ./FlowNet2_src/models/components/ops/channelnorm/functions/channelnorm.py. Compared with the file here, we can see that the definition of the backward function reads ctx.norm_deg, which is never set in the forward function. So we can simply add the line ctx.norm_deg = norm_deg to the forward function before its return statement. Then the model can be trained :). But another problem comes up: the model does not converge with lr = 1e-4 :(, so now I am trying lr = 1e-5.
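To illustrate the fix, here is a minimal self-contained sketch of a ChannelNorm autograd Function written in pure PyTorch (the actual file wraps a CUDA kernel, so the surrounding code differs; only the placement of the `ctx.norm_deg = norm_deg` line is the point):

```python
import torch
from torch.autograd import Function

class ChannelNorm(Function):
    """Pure-PyTorch stand-in for the repo's CUDA ChannelNorm, showing where
    ctx.norm_deg must be stored so that backward can read it."""

    @staticmethod
    def forward(ctx, input1, norm_deg=2):
        # L-norm_deg norm across the channel dimension (dim=1)
        output = input1.abs().pow(norm_deg).sum(dim=1, keepdim=True).pow(1.0 / norm_deg)
        ctx.save_for_backward(input1, output)
        ctx.norm_deg = norm_deg  # the missing line: without it, backward
                                 # raises AttributeError on ctx.norm_deg
        return output

    @staticmethod
    def backward(ctx, grad_output):
        input1, output = ctx.saved_tensors
        norm_deg = ctx.norm_deg  # defined only because forward stored it
        grad_input = (grad_output * input1.abs().pow(norm_deg - 1) * input1.sign()
                      / output.clamp(min=1e-12).pow(norm_deg - 1))
        return grad_input, None  # None for the non-tensor norm_deg argument

x = torch.randn(1, 3, 4, 4, requires_grad=True)
y = ChannelNorm.apply(x, 2)
y.sum().backward()  # succeeds now that ctx.norm_deg is defined
```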
Did decreasing the learning rate help? I ran into the same problem with VGG, and freezing some of the weights solved it.
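For reference, a minimal sketch of the freezing approach mentioned above (the model and the choice of which layer to freeze are illustrative, not from the thread): set `requires_grad = False` on the parameters you want fixed and pass only the trainable ones to the optimizer.

```python
import torch
import torch.nn as nn

# Toy model standing in for a network with pretrained early layers
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1),   # "early" layer we choose to freeze
    nn.ReLU(),
    nn.Conv2d(8, 2, 3, padding=1),   # layer left trainable
)

for p in model[0].parameters():
    p.requires_grad = False          # frozen params receive no gradient

# Optimize only trainable params; lowered lr as the thread suggests
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-5
)

out = model(torch.randn(1, 3, 8, 8))
out.mean().backward()
```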
Hi, can you post the link again? Or share your solution? Thank you very much!
I am trying to back-propagate a custom gradient tensor through the FlowNet2 model. I know that this is possible in PyTorch using the following methodology:
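The general PyTorch pattern being referred to is calling `backward(gradient=...)` on a non-scalar output with a tensor of the same shape, which seeds the backward pass with that custom gradient. A minimal example (the computation is arbitrary, just for illustration):

```python
import torch

x = torch.randn(2, 3, requires_grad=True)
y = x * 2                          # any differentiable computation
custom_grad = torch.ones_like(y)   # custom gradient tensor, same shape as y
y.backward(custom_grad)            # propagates custom_grad through the graph
# x.grad is now dy/dx contracted with custom_grad, i.e. 2 * custom_grad
```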
I am trying to replicate this with FlowNet2. Here is the relevant code snippet:
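The original snippet is not preserved here; the following is a reconstruction of the call pattern from the error line quoted below, with a small placeholder module standing in for the FlowNet2 model (the real model takes image pairs and outputs a 2-channel flow field):

```python
import torch
import torch.nn as nn

# Placeholder for FlowNet2: 6 input channels (a stacked image pair),
# 2 output channels (the flow field)
flownet2 = nn.Conv2d(6, 2, 3, padding=1)

img_pair = torch.randn(1, 6, 64, 64, requires_grad=True)
curr_flownet_out = flownet2(img_pair)            # flow-like output

custom_grad = torch.ones_like(curr_flownet_out)  # custom gradient tensor
curr_flownet_out.backward(custom_grad)           # the line that raised the error
```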
However, when I attempt to run this, I encounter an error at line:
curr_flownet_out.backward(custom_grad)
Any ideas as to how I can successfully use PyTorch's autograd feature to propagate a custom gradient tensor through FlowNet2?
Thanks!