NifTK / NiftyNet

[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
http://niftynet.io
Apache License 2.0

Layers should allow more options than just 'batch norm' and 'group norm' (instance norm?) #280

Open Zach-ER opened 5 years ago

Zach-ER commented 5 years ago

Layers currently have a with_bn option that is overridden for group normalization whenever a positive group size is given. Instead of this boolean flag, we should have a bn_type string that determines which type of normalization to apply.
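A minimal sketch of what that dispatch could look like in TF 1.x (the `feature_norm` helper and the `norm_type` values are illustrative assumptions, not the existing NiftyNet API; learnable scale/offset parameters are omitted for brevity):

```python
import tensorflow as tf

def feature_norm(x, norm_type='batch', group_size=None,
                 is_training=True, eps=1e-5):
    # x is an NHWC tensor; 'norm_type' replaces the boolean with_bn flag
    # plus the implicit positive-group_size override.
    if norm_type == 'batch':
        return tf.layers.batch_normalization(x, training=is_training)
    if norm_type in ('group', 'instance'):
        # instance norm is group norm with one channel per group
        size = 1 if norm_type == 'instance' else group_size
        n, h, w, c = x.shape.as_list()
        g = tf.reshape(x, [-1, h, w, c // size, size])
        mean, var = tf.nn.moments(g, axes=[1, 2, 4], keep_dims=True)
        g = (g - mean) / tf.sqrt(var + eps)
        return tf.reshape(g, [-1, h, w, c])
    raise ValueError('unsupported norm_type: %r' % norm_type)
```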

tvercaut commented 5 years ago

I find the variable name bn_type confusing if you end up not using BN... Maybe replace it with featnorm_type or something along these lines?

tvercaut commented 5 years ago

Should we take the opportunity of this PR to address #285 at the same time?

In the spirit of TF, we could also maybe go for feature_normalization as a flag name (see discussions in https://github.com/NifTK/NiftyNet/pull/282).
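If the flag were exposed in the ini-style configuration, it might read as follows (a sketch only; the section placement and the set of accepted values are assumptions, not the current config schema):

```ini
[NETWORK]
# hypothetical flag; e.g. one of: batch, group, instance, none
feature_normalization = group
group_size = 4
```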

wyli commented 5 years ago

Fixing #285 would break some of the model zoo items because of the variable name scopes, so we need a separate PR for #285, probably updating the model zoo items as well.

LucasFidon commented 5 years ago

To my understanding, instance normalization is a special case of group normalization where group_size is equal to 1, so instance normalization can already be used in the current setting.
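That equivalence is easy to check numerically. Here is a quick sketch with hand-written reference implementations in plain NumPy (not NiftyNet's own layers), where group_size means channels per group:

```python
import numpy as np

def group_norm(x, group_size, eps=1e-5):
    # x: (N, H, W, C); normalise over H, W and each group of channels
    n, h, w, c = x.shape
    g = x.reshape(n, h, w, c // group_size, group_size)
    mean = g.mean(axis=(1, 2, 4), keepdims=True)
    var = g.var(axis=(1, 2, 4), keepdims=True)
    return ((g - mean) / np.sqrt(var + eps)).reshape(n, h, w, c)

def instance_norm(x, eps=1e-5):
    # normalise each (sample, channel) slice over its spatial extent
    mean = x.mean(axis=(1, 2), keepdims=True)
    var = x.var(axis=(1, 2), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(2, 8, 8, 6)
# group norm with group_size == 1 matches instance norm
assert np.allclose(group_norm(x, group_size=1), instance_norm(x))
```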

Along these lines, the InstanceNormLayer class in niftynet.layer.bn could perhaps be removed, or at least marked as deprecated.

I agree the flags could be made clearer, though.