gathierry / FastFlow

Apache License 2.0

Unmentioned but critical LayerNorm #3

Open gathierry opened 2 years ago

gathierry commented 2 years ago

To achieve results comparable to the original paper, LayerNorm is applied to the features before the NF. This is never mentioned in the paper, and the usage is very tricky (but it is the only way that works for me):
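A minimal sketch of the trick described above, assuming PyTorch: one `nn.LayerNorm` per backbone scale, normalizing over the full `(C, H, W)` feature-map shape before the flow head. The shapes here are illustrative, not the repo's actual configuration.

```python
import torch
import torch.nn as nn

# Two hypothetical backbone feature maps at different scales (B, C, H, W).
features = [torch.randn(2, 64, 56, 56), torch.randn(2, 128, 28, 28)]

# One LayerNorm per scale over (C, H, W). Normalizing over the spatial
# dims ties each module to a fixed input resolution, which is part of
# why the usage is tricky.
norms = nn.ModuleList([nn.LayerNorm(f.shape[1:]) for f in features])

normed = [norm(f) for norm, f in zip(norms, features)]
for f in normed:
    print(f.shape)
```

Each normalized map keeps its original shape, so it can be fed straight into the corresponding NF branch.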

cytotoxicity8 commented 2 years ago

I measured the performance of the models without the LayerNorm parts. For both resnet18 and wide-resnet50, AUROC was quite similar, sometimes even better than the original. DeiT also showed comparable performance (lower by 0.03~0.05). However, with CaiT the loss was extremely high and AUROC was 0.5! I can't understand why these models behave so differently depending on Layer Normalization.

cytotoxicity8 commented 2 years ago

[image: training curves; the red one is w/o elementwise-affine] I am experimenting to improve FastFlow; discussion is always open.

AncientRemember commented 1 year ago

Use x = x.flatten(2).transpose(1, 2) to reshape the feature map from BCHW to (B, N, C); that way the LayerNorm doesn't depend on the input size.
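The suggested reshape can be sketched as follows (PyTorch, illustrative shapes): flattening to tokens means `nn.LayerNorm` only normalizes over the channel dimension, so it no longer depends on H and W.

```python
import torch
import torch.nn as nn

x = torch.randn(2, 64, 56, 56)         # B, C, H, W
tokens = x.flatten(2).transpose(1, 2)  # B, N, C with N = H * W
norm = nn.LayerNorm(64)                # depends only on C

y = norm(tokens)
# Reshape back to a feature map before handing it to the flow head.
y = y.transpose(1, 2).reshape(2, 64, 56, 56)
print(y.shape)  # torch.Size([2, 64, 56, 56])
```

The same `norm` module then works unchanged for any input resolution.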

AncientRemember commented 1 year ago

Maybe using BN after the conv2d will work.
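A sketch of this alternative, assuming PyTorch: BatchNorm statistics are per-channel, so a `Conv2d` + `BatchNorm2d` block is input-size independent, unlike a LayerNorm over the full `(C, H, W)` shape. The channel count here is illustrative.

```python
import torch
import torch.nn as nn

block = nn.Sequential(
    nn.Conv2d(64, 64, kernel_size=3, padding=1),
    nn.BatchNorm2d(64),  # normalizes per channel, independent of H and W
)

# The same block accepts feature maps of any spatial size.
for hw in (28, 56):
    out = block(torch.randn(2, 64, hw, hw))
    print(out.shape)
```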

gathierry commented 1 year ago

Well, after learning more about transformers, I realize that adding LayerNorm to intermediate output feature maps is very common, for example when applying transformers as the backbone in semantic segmentation (https://github.com/SwinTransformer/Swin-Transformer-Semantic-Segmentation/blob/87e6f90577435c94f3e92c7db1d36edc234d91f6/mmseg/models/backbones/swin_transformer.py#L620). So I guess that's why the paper never mentioned it.

And for resnet, maybe LayerNorm is not necessary, as pointed out by @cytotoxicity8.