I roughly understand why w and h differ, but I also noticed that when batch_size is greater than 1, all images still share the same learnable parameter weight: `self.complex_weight = nn.Parameter(torch.randn(dim, h, w, 2, dtype=torch.float32) * 0.02)` followed by `x = x * weight`. Won't different images within a batch interfere with each other this way?
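For what it's worth, here is a minimal, hypothetical sketch of the filtering step being asked about (a simplification with an assumed channels-first `(B, dim, h, h)` layout, not the repo's exact module). It shows why the shared weight does not mix images: the element-wise multiply simply broadcasts over the batch dimension.

```python
import torch
import torch.nn as nn

class GlobalFilter(nn.Module):
    # Hypothetical simplification of the module quoted above,
    # using a channels-first (B, dim, h, h) layout for clarity.
    def __init__(self, dim, h):
        super().__init__()
        w = h // 2 + 1  # rFFT width, as discussed in this thread
        # One shared filter for the whole batch, as in the snippet above.
        self.complex_weight = nn.Parameter(
            torch.randn(dim, h, w, 2, dtype=torch.float32) * 0.02)
        self.h = h

    def forward(self, x):
        # x: (B, dim, h, h) real-valued features
        X = torch.fft.rfft2(x, dim=(2, 3), norm='ortho')     # (B, dim, h, h//2+1)
        weight = torch.view_as_complex(self.complex_weight)  # (dim, h, h//2+1)
        # weight has no batch axis, so the multiply broadcasts over B:
        # each image is filtered independently by the same filter,
        # and images in a batch never mix with each other.
        X = X * weight
        return torch.fft.irfft2(X, s=(self.h, self.h), dim=(2, 3), norm='ortho')
```

For example, `GlobalFilter(dim=64, h=14)(torch.randn(4, 64, 14, 14))` returns a `(4, 64, 14, 14)` tensor, with each of the 4 images processed independently.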
Hi @ye40, thanks for your interest in our work. The FFT of a real signal is Hermitian-symmetric, so there are only (h, h//2 + 1) independent complex values in the rFFT of a (h, h) tensor. Please refer to the discussions around Eq. 3.7 in our paper for more details.
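A quick way to verify this shape and symmetry in PyTorch (a minimal sanity check, not code from the repo or paper):

```python
import torch

h = 14
x = torch.randn(h, h)                 # real-valued (h, h) input
X = torch.fft.rfft2(x)                # real-to-complex 2-D FFT
print(X.shape)                        # torch.Size([14, 8]) == (h, h // 2 + 1)

# The full spectrum is Hermitian-symmetric, so the half that rfft2
# drops is redundant: F[i, j] == conj(F[-i % h, -j % h]).
F = torch.fft.fft2(x)                 # full (h, h) complex spectrum
assert torch.allclose(X, F[:, : h // 2 + 1], atol=1e-5)
assert torch.allclose(F[1, 3], F[-1, -3].conj(), atol=1e-5)
```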
Thank you for your excellent work. I'm curious why w is set to `w = h // 2 + 1`. Is there a theoretical reason for this choice, or did experiments simply show that this setting works better?