[Open] Happypat-cloud opened this issue 3 years ago
I want to know how you deal with the batch norm in the deeplabv3plus head. As far as I know, AdaSeg freezes the BN layers in the backbone that were pretrained on ImageNet, and in the deeplabv2 head there are no BN layers at all.
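For reference, this is the kind of backbone-BN freezing I mean. It is only a minimal PyTorch sketch assuming a torchvision ResNet-101 backbone; the helper name `freeze_backbone_bn` is mine for illustration and is not taken from this repository:

```python
import torch.nn as nn
from torchvision.models import resnet101


def freeze_backbone_bn(backbone: nn.Module) -> nn.Module:
    """Freeze all BatchNorm2d layers in an ImageNet-pretrained backbone."""
    for m in backbone.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.eval()                       # stop updating running mean/var
            for p in m.parameters():
                p.requires_grad = False    # stop updating gamma/beta
    return backbone


# Hypothetical usage with an ImageNet-pretrained backbone
backbone = resnet101(pretrained=True)
backbone = freeze_backbone_bn(backbone)
```

Note that a later call to `model.train()` puts the BN layers back into training mode, so this freezing is usually re-applied after every `train()` call (or by overriding `train()` on the backbone). The BN layers in the deeplabv3plus head are newly initialized rather than pretrained, which is exactly why I am asking how they are handled.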