zhangganlin / Improved-PSMNet-for-Deep-Stereo-Disparity-Estimation

Improved-PSMNet: Combining PSM, group-wise corr, dilatedResNet, segmentation to estimate accurate disparity of stereo image pairs efficiently.

Question about the code #2

Open zzr0127 opened 10 months ago

zzr0127 commented 10 months ago

Hello, I haven't fully understood the code. The two models in models, dilatedcostfiltering and stackhourglass, are nearly identical; which one is actually used for training? Also, the semantic segmentation module is standalone; how is it attached to the network? I would like to ask you about this. Could I get your contact information, or could you give me your email address? Many thanks in advance.

zhangganlin commented 10 months ago

Hi, we use dilatedcostfiltering in training; stackhourglass is for comparison in the report. As for semantic segmentation, we first feed the left image into the pre-trained segmentation network (all parameters fixed), which comes mainly from this GitHub repo. Then we concatenate the segmentation result with the spatial volume to form the cost volume (shown in Fig. 1 of our report). Since this project was finished two years ago, I am not sure about the very detailed implementation either, but we mainly used the original PSMNet as a reference; it might be helpful to read the original paper and code first if this repo is not clear enough for you. Hope I answered your question somehow; if not, feel free to ask again :)
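For intuition, here is a minimal sketch of that concatenation step (a hypothetical reconstruction in PyTorch, not the repo's actual code; seg_net and feature_net are placeholder modules standing in for the frozen segmentation network and the PSMNet-style feature extractor):

```python
import torch

def build_cost_volume(left_img, right_img, seg_net, feature_net, max_disp=192):
    # The segmentation network is pre-trained; its parameters stay fixed.
    with torch.no_grad():
        seg_feat = seg_net(left_img)      # (B, C_seg, H/4, W/4), assumed 1/4 resolution

    left_feat = feature_net(left_img)     # (B, C, H/4, W/4)
    right_feat = feature_net(right_img)   # (B, C, H/4, W/4)

    # Concatenate the segmentation result onto the left feature map.
    left_feat = torch.cat([left_feat, seg_feat], dim=1)

    B, C_left, H, W = left_feat.shape
    C_right = right_feat.shape[1]
    D = max_disp // 4                     # disparity levels at 1/4 resolution

    # PSMNet-style concatenation cost volume: for each candidate disparity d,
    # stack the left features with the right features shifted by d pixels.
    cost = left_feat.new_zeros(B, C_left + C_right, D, H, W)
    for d in range(D):
        if d == 0:
            cost[:, :C_left, d] = left_feat
            cost[:, C_left:, d] = right_feat
        else:
            cost[:, :C_left, d, :, d:] = left_feat[:, :, :, d:]
            cost[:, C_left:, d, :, d:] = right_feat[:, :, :, :-d]
    return cost                           # (B, C_left + C_right, D, H, W)
```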

Best, Ganlin

zzr0127 commented 10 months ago

Thanks for your reply. Currently I can't find the segmentation-related code and the pre-trained network. Could you give me a slightly more detailed hint? Thanks again for your reply.

Best,


zhangganlin commented 10 months ago

The segmentation network is inside the semantic_segmentation folder. For its usage, check generate_seg.py; there is also an official README.md inside semantic_segmentation. For the pre-trained segmentation network, check download_pretrained_model.sh. By the way, train.sh shows how to run the whole pipeline. All these details are already provided in the README.md, please have a look.
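Putting that together, the run order would be roughly the following (hypothetical invocations; the README.md and the scripts themselves are authoritative for the actual paths and arguments):

```bash
bash download_pretrained_model.sh   # fetch the pre-trained segmentation weights
python generate_seg.py              # precompute segmentation results with the frozen network
bash train.sh                       # run the whole training pipeline
```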

Best, Ganlin

zzr0127 commented 10 months ago

Hi,

Thank you very much for your reply. I am not very familiar with the segmentation module, and I am preparing to reread it.

Best,
