Closed: AlexanderYaroshevichIAC closed this issue 11 months ago.
Are you using keras.mixed_precision.Policy("float16")? I was actually aware of this issue, but was only expecting usage with keras.mixed_precision.Policy("mixed_float16")... Anyway, it should be fixed now.
I'm using keras.mixed_precision.Policy("mixed_float16").
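For context, this is the standard way to enable that policy globally in Keras (the model constructor below is just an example from this package, not code from the thread):

```python
from tensorflow import keras

# Compute in float16, keep variables in float32 for numerical stability.
keras.mixed_precision.set_global_policy("mixed_float16")

from keras_cv_attention_models import beit

# Any model built after setting the policy runs its layers with
# compute_dtype == "float16"; BeitBasePatch16 is used here as an example.
model = beit.BeitBasePatch16(pretrained=None)
print(keras.mixed_precision.global_policy())  # <Policy "mixed_float16">
```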
How is it after the previous fix? I've replaced all those dtypes with self.compute_dtype. Does the issue still exist?
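A rough sketch of the fix being described, using a hypothetical custom layer rather than the actual beit code: a tensor created with a hardcoded dtype="float32" mismatches the float16 inputs a layer receives under mixed precision, whereas self.compute_dtype follows whatever policy is active.

```python
import tensorflow as tf
from tensorflow import keras

class ScaledResidual(keras.layers.Layer):
    """Hypothetical layer; illustrates the dtype fix, not the actual beit code."""

    def build(self, input_shape):
        # Broken under mixed_float16: a hardcoded dtype mismatches the
        # float16 inputs the layer receives at call time.
        #   self.gamma = tf.ones(input_shape[-1:], dtype="float32")
        # Fixed: derive the dtype from the layer's compute dtype so it
        # follows the active mixed-precision policy.
        self.gamma = tf.ones(input_shape[-1:], dtype=self.compute_dtype)
        super().build(input_shape)

    def call(self, inputs):
        # Both operands share self.compute_dtype, so the multiply is valid.
        return inputs * self.gamma
```

Note that variables created through self.add_weight don't need this treatment, since Keras auto-casts them to the compute dtype; the problem arises mainly with plain tensors created with an explicit dtype.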
Closing, it should work now.
The problem is in keras_cv_attention_models/beit/beit.py, lines 88-89: if we change float32 to float16 there, it starts working with fp16.
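A quick end-to-end check of the fix might look like the following sketch (the model constructor and 224x224 input shape are assumptions; a dummy forward pass is enough to trigger the dtype mismatch that the reporter traced to those lines):

```python
import numpy as np
from tensorflow import keras

keras.mixed_precision.set_global_policy("mixed_float16")

from keras_cv_attention_models import beit

# pretrained=None skips the weight download for a quick smoke test.
model = beit.BeitBasePatch16(pretrained=None)

# Before the fix this forward pass raised a dtype-mismatch error traced back
# to beit.py:88-89; after it, the call should succeed under mixed_float16.
out = model(np.zeros([1, 224, 224, 3], dtype="float32"), training=False)
print(out.dtype)
```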