Closed Jinyong-Huang closed 4 years ago
Thanks for your interest. Our paper should be available on arXiv tomorrow.
Great.
Wonderful job!
When I run the AANet and AANet+ scripts, len(pred_disp_pyramid) is always 5.
Yes, you are right. It's a typo in the comments and will not affect the model.
thank you for your reply!
When I modify max_disp (e.g. to 48), I can't load the pretrained model; the error is "... size mismatch for ...". How can I successfully load the AANet+ pretrained model if I have to modify max_disp?
You can manually exclude the aggregation part of the model weights and load remaining weights.
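A minimal sketch of that filtering idea, with made-up layer names and shapes standing in for real tensors (in PyTorch you would then call `model.load_state_dict(filtered, strict=False)` on the result):

```python
import numpy as np

def filter_state_dict(pretrained, model_state):
    """Keep only pretrained weights whose key exists in the current
    model's state_dict with an identical shape."""
    return {k: v for k, v in pretrained.items()
            if k in model_state and v.shape == model_state[k].shape}

# Toy state_dicts: the aggregation layer's shape changes after modifying
# max_disp, so its pretrained weights must be dropped.
pretrained = {"feature.conv1.weight": np.zeros((32, 3, 3, 3)),
              "aggregation.conv1.weight": np.zeros((64, 64, 3, 3))}
model_state = {"feature.conv1.weight": np.zeros((32, 3, 3, 3)),
               "aggregation.conv1.weight": np.zeros((48, 48, 3, 3))}

filtered = filter_state_dict(pretrained, model_state)
# only "feature.conv1.weight" survives; load it with strict=False
```

The aggregation part is shape-dependent on max_disp, so those layers start from random initialization while the rest of the network keeps the pretrained weights.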
(Translated from Chinese:) When training on another dataset, I changed the disparity range to -48 to +48. During training, the first three entries of the final output (pred_disp_pyramid) are normal and scale by the expected factors, but the outputs of the last two disparity refinement stages are clearly wrong. I eventually found that the photometric error computed against the left image is extremely large. I don't quite understand the disp_warp operation in refinement.py. If I change the disparity range (so it includes negative values), how should I modify the disp_warp code?
(Translated from Chinese:) It's probably caused by the ReLU, which turns negative disparities into 0:
disp = F.relu(disp + residual_disp, inplace=True) # [B, 1, H, W]
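A small numeric sketch of why that ReLU breaks a signed disparity range, using numpy stand-ins for the tensors. The clamp-based alternative shown here is an assumption on my part, not the repository's code:

```python
import numpy as np

def relu(x):
    # same effect as F.relu: negative values become 0
    return np.maximum(x, 0.0)

disp = np.array([-10.0, -1.0, 0.0, 5.0])
residual = np.array([2.0, 0.5, 0.0, -1.0])

# Original behaviour: every negative refined disparity is clipped to zero,
# so the warp samples the wrong pixels and the photometric error explodes.
out_relu = relu(disp + residual)                 # -> [0., 0., 0., 4.]

# One possible fix for a signed range [-max_disp, max_disp]: clamp
# symmetrically instead of applying ReLU (hypothetical, untested change).
max_disp = 48
out_clamped = np.clip(disp + residual, -max_disp, max_disp)  # -> [-8., -0.5, 0., 4.]
```

In PyTorch the corresponding change would be something like `disp = torch.clamp(disp + residual_disp, -max_disp, max_disp)`; the disp_warp grid formula `x - disp` itself already handles signed disparities.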
I can't find your paper. Could you give me a link?