caiyuanhao1998 / Retinexformer

"Retinexformer: One-stage Retinex-based Transformer for Low-light Image Enhancement" (ICCV 2023) & (NTIRE 2024 Challenge)
https://arxiv.org/abs/2303.06705
MIT License

About the first Layer Norm module in the Illumination-Guided Attention Block #89

Closed · qulishen closed 2 months ago

qulishen commented 2 months ago

I see that the IGAB code looks like this:

        for _ in range(num_blocks):
            self.blocks.append(nn.ModuleList([
                IG_MSA(dim=dim, dim_head=dim_head, heads=heads),
                PreNorm(dim, FeedForward(dim=dim))
            ]))

The first LayerNorm (the one before IG_MSA) seems to be missing? I ask because the paper's architecture figure shows it.
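
For context, a PreNorm wrapper of this kind conventionally applies nn.LayerNorm to its input before calling the wrapped module. A minimal sketch of the usual definition (the exact implementation in RetinexFormer_arch.py may differ in details):

    import torch.nn as nn

    class PreNorm(nn.Module):
        """Apply LayerNorm to the input, then call the wrapped module (pre-norm style)."""
        def __init__(self, dim, fn):
            super().__init__()
            self.norm = nn.LayerNorm(dim)
            self.fn = fn  # e.g. FeedForward(dim=dim)

        def forward(self, x, *args, **kwargs):
            return self.fn(self.norm(x), *args, **kwargs)

Under that reading, only the FeedForward in the block list above is preceded by a LayerNorm, which is what prompts the question.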

caiyuanhao1998 commented 2 months ago

Hello, thanks for your interest. The first Layer Norm is implemented inside IG-MSA: https://github.com/caiyuanhao1998/Retinexformer/blob/master/basicsr/models/archs/RetinexFormer_arch.py#L166

        q = F.normalize(q, dim=-1, p=2)
        k = F.normalize(k, dim=-1, p=2)
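
Strictly speaking, this is an L2 normalization of the query and key vectors rather than a LayerNorm over the token features: it turns the attention logits into cosine similarities, keeping their magnitude bounded before the softmax. A minimal standalone sketch of that step, with a hypothetical function name and simplified shapes (the linked file computes attention channel-wise after several transposes, with a learnable per-head rescale):

    import torch
    import torch.nn.functional as F

    def l2_normalized_attention(q, k, v, rescale=1.0):
        # q, k, v: (batch, heads, d, n); each row becomes unit length after
        # normalization, so k @ q^T contains cosine similarities in [-1, 1]
        q = F.normalize(q, dim=-1, p=2)
        k = F.normalize(k, dim=-1, p=2)
        attn = (k @ q.transpose(-2, -1)) * rescale  # learnable per-head parameter in the real code
        attn = attn.softmax(dim=-1)
        return attn @ v

    # toy shapes: 1 batch, 4 heads, 16-dim heads, 64 tokens
    q = k = v = torch.randn(1, 4, 16, 64)
    out = l2_normalized_attention(q, k, v)  # (1, 4, 16, 64)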

If you find our repo useful, please give it a star to support us. Thanks :)