sefibk / KernelGAN


Why is a flip operation needed when calculating the kernel? #79

Open 123abcgit opened 1 year ago

123abcgit commented 1 year ago

Hi, thanks for sharing this great work, which is very clear. However, I am confused about why a flip operation is needed at the end of the `calc_curr_k` function in KernelGAN.py. Looking forward to your reply. Thank you very much.

```python
def calc_curr_k(self):
    """Given a generator network, the function calculates the kernel it is imitating"""
    delta = torch.Tensor([1.]).unsqueeze(0).unsqueeze(-1).unsqueeze(-1)
    for ind, w in enumerate(self.G.parameters()):
        curr_k = F.conv2d(delta, w, padding=self.conf.G_kernel_size - 1) if ind == 0 else F.conv2d(curr_k, w)
    self.curr_k = curr_k.squeeze().flip([0, 1])
```

sefibk commented 1 year ago

I hope I understand your question. If so: in order to extract the kernel of G, we pass a delta (1 in the center, 0 everywhere else) through G. What we get out is the kernel flipped, so we must un-flip it. Try it even with a 1-layer G and you will notice the kernel flipping. Hope that helps.
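A minimal standalone sketch (not from the repo) of the effect described above: PyTorch's `F.conv2d` actually computes cross-correlation rather than true convolution, so passing a centered delta through a conv layer returns that layer's weight flipped in both spatial dimensions, and flipping it back recovers the original kernel, which is what the `.flip([0, 1])` in `calc_curr_k` does.

```python
import torch
import torch.nn.functional as F

k = 3
# An asymmetric "kernel" so the flip is visible (hypothetical example weight)
w = torch.arange(float(k * k)).reshape(1, 1, k, k)

# Delta input: 1 in the center of a (2k-1)x(2k-1) map, 0 everywhere else
delta = torch.zeros(1, 1, 2 * k - 1, 2 * k - 1)
delta[0, 0, k - 1, k - 1] = 1.

# Cross-correlating the delta with w yields w flipped in both spatial dims
extracted = F.conv2d(delta, w).squeeze()
assert torch.equal(extracted, w.squeeze().flip([0, 1]))

# Flipping back recovers the original kernel
assert torch.equal(extracted.flip([0, 1]), w.squeeze())
```

The same flip happens regardless of network depth, since a stack of linear convolutions is equivalent to a single convolution with the composed kernel.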