yjsunnn / FBANet

Official implementation of ICCV2023 "Towards Real-World Burst Image Super-Resolution: Benchmark and Method"
MIT License

The effectiveness of Federated Affinity Fusion #8

Closed qtct closed 9 months ago

qtct commented 10 months ago

Hello, I compared the FAF and TSA fusion modules and found that FAF contains more residual blocks, and its spatial attention uses convolutions with more channels, giving it a much larger parameter count than TSA. When I replaced FAF's spatial attention with TSA's spatial attention, the performance was worse than TSA's. Is FAF's advantage over TSA in this paper due to its larger number of parameters, rather than the fusion design itself being better than TSA's? Thanks for your reply.

yjsunnn commented 10 months ago

Hi, thanks a lot for your interest in our work. We call the baseline "TSA" for quick understanding, but it actually only borrows the core idea of the original TSA in EDVR (namely, assigning weights based on similarity). The only difference between the TSA used in our paper and FAF is whether the difference affinity maps are used; the embedding dimension and the network blocks are exactly the same.

Maybe you can leave your email and I will send you the code of the TSA model (or you can simply use corr_l in our code as the weights instead of corr_diff).
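To make the distinction concrete, here is a minimal numpy sketch of similarity-weighted fusion. It is not the repository's code: the function names, shapes, and the exact form of the difference affinity are assumptions for illustration. `corr_l` stands for plain reference-vs-frame similarity (the TSA-style weights the author mentions), while the `use_diff` branch sketches weighting by how small the embedding difference is, in the spirit of FAF's difference affinity maps.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def fuse(frames, emb_ref, embs, use_diff=False):
    """Hypothetical affinity-weighted burst fusion.

    frames:  (N, C, H, W) aligned burst features to be fused
    emb_ref: (E, H, W)    embedding of the reference frame
    embs:    (N, E, H, W) embeddings of each burst frame
    """
    if use_diff:
        # Sketch of a difference-based affinity: weight is high where the
        # embedding difference to the reference is small (FAF-like idea).
        diff = embs - emb_ref[None]
        corr = -np.einsum('nehw,nehw->nhw', diff, diff)
    else:
        # corr_l: plain per-pixel similarity to the reference (TSA-like idea).
        corr = np.einsum('ehw,nehw->nhw', emb_ref, embs)
    w = softmax(corr, axis=0)                  # normalize across the burst
    return (frames * w[:, None]).sum(axis=0)   # (C, H, W) fused result
```

Swapping `use_diff` while keeping everything else fixed is the kind of controlled comparison the reply describes: same embeddings, same blocks, only the affinity definition changes.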

qtct commented 10 months ago


Thank you very much. I will test it again.