Open SilvinaR opened 3 years ago
Hi, in the `Unet` class:

```python
self.prediction = Sequential([
    Conv3D(num_labels, [1] * 3, name='prediction',
           kernel_initializer=heatmap_layer_kernel_initializer,
           activation=None, data_format=data_format, padding=padding),
    Activation(None, dtype='float32', name='prediction'),
])
```

Why do you use `Activation(None)`? What does it do? Is this a convolutional operation or a fully connected one?

Thanks!

---

`Activation(None, dtype='float32')` just casts the input tensor to `tf.float32`. This is needed (and recommended) when TensorFlow uses mixed precision: the model body computes in float16, but the final output should be cast back to float32. See the TF mixed precision guide for more details: https://www.tensorflow.org/guide/mixed_precision
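A minimal sketch of the effect being described, independent of this repo's `Unet` class: under the `mixed_float16` policy, a Keras layer computes in float16, and a trailing `Activation(None, dtype='float32')` is an identity op whose only job is to cast the output back to float32. (The layer shapes and sizes here are arbitrary, chosen just for illustration.)

```python
import tensorflow as tf
from tensorflow.keras import mixed_precision
from tensorflow.keras.layers import Activation, Conv3D

# Enable mixed precision: layers compute in float16, variables stay float32.
mixed_precision.set_global_policy('mixed_float16')

conv = Conv3D(2, [1, 1, 1])                # computes and outputs in float16
cast = Activation(None, dtype='float32')   # identity activation, casts to float32

x = tf.random.normal([1, 4, 4, 4, 3])      # NDHWC input (channels_last)
y = conv(x)
z = cast(y)

print(y.dtype)  # float16: the conv's output under the mixed policy
print(z.dtype)  # float32: the no-op Activation restored full precision

# Reset the policy so this snippet doesn't leak state into other code.
mixed_precision.set_global_policy('float32')
```

So `Activation(None)` is neither convolutional nor fully connected; it applies no function at all and exists only for the dtype cast. The `Conv3D` with a `[1, 1, 1]` kernel just above it is what produces the per-voxel predictions.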