haofeixu / aanet

[CVPR'20] AANet: Adaptive Aggregation Network for Efficient Stereo Matching
Apache License 2.0

where is your paper #1

Closed — Jinyong-Huang closed this issue 4 years ago

Jinyong-Huang commented 4 years ago

I can't find your paper; could you give me a link?

haofeixu commented 4 years ago

Thanks for your interest. Our paper should be available on arXiv tomorrow.

Jinyong-Huang commented 4 years ago

Great.


haofeixu commented 4 years ago

https://arxiv.org/abs/2004.09548

Jinyong-Huang commented 4 years ago

Wonderful job!

Jinyong-Huang commented 4 years ago

[screenshot]

Jinyong-Huang commented 4 years ago

When I run the AANet and AANet+ scripts, len(pred_disp_pyramid) is always 5.

haofeixu commented 4 years ago

Yes, you are right. It's a typo in the comments and will not affect the model.

Jinyong-Huang commented 4 years ago

Thank you for your reply!

Jinyong-Huang commented 4 years ago

[screenshot]

When I modify max_disp (for example, to 48), I can't load the pretrained model; the error is "... size mismatch for ...". How can I successfully load the AANet+ pretrained model if I have to modify max_disp?

haofeixu commented 4 years ago

You can manually exclude the aggregation part of the model weights and load the remaining weights.
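A minimal sketch of how this could be done is below. It assumes the aggregation module's parameter names contain the substring 'aggregation' and that the checkpoint may wrap the weights under a 'state_dict' key; both are assumptions to verify against your actual checkpoint.

```python
import torch


def load_pretrained_skip_aggregation(model, checkpoint_path):
    """Load matching weights from a checkpoint, skipping the cost-aggregation
    part whose shapes depend on max_disp.

    `model` is assumed to be an AANet/AANet+ instance already built with the
    new max_disp; the 'aggregation' key substring is an assumption about the
    parameter names and should be checked against the checkpoint keys.
    """
    checkpoint = torch.load(checkpoint_path, map_location='cpu')
    pretrained = checkpoint.get('state_dict', checkpoint)  # some checkpoints wrap the weights

    model_dict = model.state_dict()
    filtered = {
        k: v for k, v in pretrained.items()
        if k in model_dict
        and v.shape == model_dict[k].shape   # drop size-mismatched tensors
        and 'aggregation' not in k           # drop the aggregation module (assumed key substring)
    }
    model_dict.update(filtered)
    model.load_state_dict(model_dict, strict=False)
    return model
```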

Jinyong-Huang commented 4 years ago

(Originally posted in Chinese for clarity; translated.) When I train on another dataset with the disparity range changed to -48 to +48, the first three outputs in pred_disp_pyramid look normal during training (they scale proportionally), but the last two refined disparity outputs are clearly wrong. I eventually found that the left-image reconstruction error becomes extremely large. I don't fully understand the disp_warp operation in refinement.py; if I change the disparity range so that it includes negative values, how should I modify the disp_warp code?
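For context, a generic disparity-based warp of the kind disp_warp performs could look like the sketch below. It is not the exact code from refinement.py; it assumes the convention that the left-view pixel (x, y) corresponds to the right-view pixel (x - d, y), so a negative disparity simply shifts the sampling point in the other direction. If your disparities use the opposite sign convention, flip the sign of the shift.

```python
import torch
import torch.nn.functional as F


def disp_warp_sketch(right_img, left_disp):
    """Warp the right image to the left view using a (possibly signed) disparity.

    Assumed convention: left-view pixel (x, y) matches right-view pixel (x - d, y).
    This is a generic sketch, not the exact disp_warp from refinement.py.
    right_img: [B, C, H, W], left_disp: [B, 1, H, W]
    """
    b, _, h, w = right_img.size()
    # Pixel-coordinate grids of shape [B, H, W]
    x_base = torch.arange(w, device=right_img.device, dtype=right_img.dtype).view(1, 1, w).expand(b, h, w)
    y_base = torch.arange(h, device=right_img.device, dtype=right_img.dtype).view(1, h, 1).expand(b, h, w)
    # Shift the x coordinate by the disparity; negative disparities shift the other way
    x_shifted = x_base - left_disp.squeeze(1)
    # Normalize coordinates to [-1, 1] as required by grid_sample
    x_norm = 2.0 * x_shifted / (w - 1) - 1.0
    y_norm = 2.0 * y_base / (h - 1) - 1.0
    grid = torch.stack((x_norm, y_norm), dim=3)  # [B, H, W, 2]
    return F.grid_sample(right_img, grid, mode='bilinear',
                         padding_mode='zeros', align_corners=True)
```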

Jinyong-Huang commented 4 years ago

[screenshots]

Jinyong-Huang commented 4 years ago

It may be caused by the ReLU, which turns negative disparities into 0: disp = F.relu(disp + residual_disp, inplace=True) # [B, 1, H, W]
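If that is the cause, one hypothetical adjustment (not the authors' fix) would be to clamp to the signed disparity range instead of applying a ReLU, e.g. for a [-48, 48] range:

```python
import torch

# Hypothetical alternative (not the authors' fix): with a signed disparity
# range such as [-48, 48], F.relu zeroes out every negative disparity, so
# clamping to the signed range keeps negative values while still bounding them.
# `disp` and `residual_disp` play the same roles as in refinement.py.
disp = torch.randn(2, 1, 64, 128) * 48       # dummy disparity, [B, 1, H, W]
residual_disp = torch.randn(2, 1, 64, 128)   # dummy residual, [B, 1, H, W]
disp = torch.clamp(disp + residual_disp, min=-48.0, max=48.0)  # [B, 1, H, W]
```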