Currently, the UNet does not have an activation function at the output layer, so the output values lie anywhere in (-∞, ∞).
Since depth values are greater than or equal to zero, it makes sense to apply an output activation function such as ReLU.
When the depth values are normalized, sigmoid can be used instead, since it maps the output to the range (0, 1).
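For illustration, a minimal sketch (assuming PyTorch, which this note does not confirm) of how the two activations bound the raw network output:

```python
import torch

# Raw, unbounded UNet output values (illustrative only).
raw = torch.tensor([-2.3, 0.0, 4.1])

relu_out = torch.relu(raw)        # clamped to [0, inf): tensor([0.0, 0.0, 4.1])
sigmoid_out = torch.sigmoid(raw)  # squashed to (0, 1), suitable for normalized depth
```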
Added a configuration option to choose the output activation function.
The option is called output_activation and must be added to network_config in the config.yml.
Possible values are ReLU and sigmoid.
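A minimal sketch of how the option might look and be consumed; the YAML excerpt, the lower-case value spellings, and the PyTorch mapping are assumptions, not the confirmed implementation:

```python
import yaml
import torch.nn as nn

# Assumed excerpt of config.yml; only the keys network_config and
# output_activation come from this note, the rest is hypothetical.
cfg = yaml.safe_load("""
network_config:
  output_activation: relu   # or: sigmoid
""")

# One possible mapping from the config value to the UNet's output layer.
activations = {
    "relu": nn.ReLU(),
    "sigmoid": nn.Sigmoid(),
}
output_activation = activations[cfg["network_config"]["output_activation"]]
```

The selected module would then be applied to the final convolution's output in the UNet's forward pass.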