zzangjinsun / NLSPN_ECCV20

Park et al., Non-Local Spatial Propagation Network for Depth Completion, ECCV, 2020
MIT License

Some inquiries about the confidence offset calculation #38

Closed. Magicboomliu closed this issue 2 years ago

Magicboomliu commented 2 years ago

Hi jinsun! I was looking through #16 and your latest version.
If I understand correctly, for a newly trained model there is no need to compute the relative coordinates ww and hh. Wouldn't it be better to guard them with an if statement to reduce the computational cost? Here:

```python
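                # Decode the linear offset index into window coordinates:
                # ww = column, hh = row within the k_f x k_f window.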
                ww = idx_off % self.k_f
                hh = idx_off // self.k_f

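                # Skip the window center (the reference pixel); its own
                # affinity (aff_ref) is handled separately.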
                if ww == (self.k_f - 1) / 2 and hh == (self.k_f - 1) / 2:
                    continue

                offset_tmp = offset_each[idx_off].detach()

                # NOTE : Use --legacy option ONLY for the pre-trained models
                # for ECCV20 results.
                if self.args.legacy:
                    offset_tmp[:, 0, :, :] = \
                        offset_tmp[:, 0, :, :] + hh - (self.k_f - 1) / 2
                    offset_tmp[:, 1, :, :] = \
                        offset_tmp[:, 1, :, :] + ww - (self.k_f - 1) / 2

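                # Sample the neighbor's confidence at the offset location
                # via a deformable convolution on the confidence map.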
                conf_tmp = ModulatedDeformConvFunction.apply(
                    confidence, offset_tmp, modulation_dummy, self.w_conf,
                    self.b, self.stride, 0, self.dilation, self.groups,
                    self.deformable_groups, self.im2col_step)
                list_conf.append(conf_tmp)

            conf_aff = torch.cat(list_conf, dim=1)
            aff = aff * conf_aff.contiguous()
```
zzangjinsun commented 2 years ago

The calculation you mentioned is still needed to "skip" the reference at the patch center. You may check how `aff` and `aff_ref` are calculated to understand this process!
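
A minimal sketch of this point (illustrative only, not the repository code; the kernel size `k_f = 3` and the `neighbors` list are assumptions for the example): the loop visits every index of the `k_f x k_f` window, and the `ww`/`hh` decoding is what identifies the center index so it can be skipped, independently of the `--legacy` offset shift.

```python
# Decode each linear index of a k_f x k_f window into (hh, ww) and skip the
# window center, which is the reference pixel whose affinity (aff_ref) is
# computed separately from the non-center neighbors.
k_f = 3  # assumed kernel size for illustration

neighbors = []
for idx_off in range(k_f * k_f):
    ww = idx_off % k_f   # column within the window
    hh = idx_off // k_f  # row within the window

    if ww == (k_f - 1) / 2 and hh == (k_f - 1) / 2:
        continue  # center of the window: the reference pixel itself

    neighbors.append((hh, ww))

print(neighbors)  # the 8 non-center positions of a 3x3 window
```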

Magicboomliu commented 2 years ago

Ok, yeah, I understand. Thanks!