Closed · romitjain closed this 1 year ago
Added hardcoding of `float32` for the activation layer.
This will prevent it from being overridden by the following call:

```python
from tensorflow.keras import mixed_precision
mixed_precision.set_global_policy('mixed_float16')
```
This can help enable mixed precision training on Unet models.
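For context, here is a minimal sketch of the pattern this change applies (the layer shapes and model structure below are illustrative, not the actual Unet code in this repo): under a `mixed_float16` global policy, intermediate layers compute in `float16`, while passing `dtype='float32'` to the final activation layer pins its output to full precision, which is the approach recommended in the TensorFlow mixed precision guide for numerically stable outputs.

```python
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

# Enable mixed precision globally; most layers now compute in float16.
mixed_precision.set_global_policy('mixed_float16')

inputs = tf.keras.Input(shape=(128, 128, 3))
x = layers.Conv2D(16, 3, padding='same', activation='relu')(inputs)  # float16 compute
logits = layers.Conv2D(1, 1, padding='same')(x)

# Hardcoding dtype='float32' on the activation layer prevents the global
# policy from overriding it, so the model output stays in full precision.
outputs = layers.Activation('sigmoid', dtype='float32')(logits)

model = tf.keras.Model(inputs, outputs)
print(model.output.dtype)  # float32, even under the mixed_float16 policy
```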