alr-internship / self-supervised-depth-denoising

Denoising YCB Objects with a self-supervised deep neural network
MIT License

output activation function #1

Closed claudius-kienle closed 2 years ago

claudius-kienle commented 2 years ago

Currently, the UNet does not have an activation function at the output layer, so the output values lie in the interval (-∞, ∞). Since depth values are greater than or equal to zero, it would make sense to apply an output activation function, e.g. ReLU.

When normalization is used, a sigmoid may be the better output activation, since the normalized targets lie in [0, 1]!
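A minimal sketch of the idea, assuming PyTorch with a single-channel depth output; the layer shapes and names are illustrative, not taken from the repo:

```python
import torch
import torch.nn as nn

final_conv = nn.Conv2d(64, 1, kernel_size=1)  # assumed final UNet layer
output_activation = nn.ReLU()                 # or nn.Sigmoid() when normalizing

features = torch.randn(1, 64, 480, 640)          # dummy decoder features
depth = output_activation(final_conv(features))  # predictions clamped to >= 0
```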

claudius-kienle commented 2 years ago

Added a configuration option to choose an output activation function. The option is called output_activation and must be added to network_config in config.yml. Possible values include the activations discussed above, i.e. ReLU and sigmoid.
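For illustration, a hedged sketch of how such an option might be resolved to a module; the function name and the exact set of accepted values are assumptions, not taken from the repo:

```python
import torch.nn as nn

def build_output_activation(name: str) -> nn.Module:
    # Hypothetical mapping from the output_activation value in
    # network_config (config.yml) to the module appended after
    # the UNet's final layer, e.g.:
    #
    # network_config:
    #   output_activation: relu
    activations = {
        "none": nn.Identity(),    # unbounded raw outputs (previous behavior)
        "relu": nn.ReLU(),        # enforce depth >= 0
        "sigmoid": nn.Sigmoid(),  # normalized depth in (0, 1)
    }
    return activations[name]
```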