leondgarse / keras_cv_attention_models

Keras beit,caformer,CMT,CoAtNet,convnext,davit,dino,efficientdet,edgenext,efficientformer,efficientnet,eva,fasternet,fastervit,fastvit,flexivit,gcvit,ghostnet,gpvit,hornet,hiera,iformer,inceptionnext,lcnet,levit,maxvit,mobilevit,moganet,nat,nfnets,pvt,swin,tinynet,tinyvit,uniformer,volo,vanillanet,yolor,yolov7,yolov8,yolox,gpt2,llama2, alias kecam
MIT License
595 stars 95 forks

eva02 fp16 not working #122

Closed AlexanderYaroshevichIAC closed 11 months ago

AlexanderYaroshevichIAC commented 1 year ago

The problem is in keras_cv_attention_models/beit/beit.py, lines 88-89: if we change float32 to float16 there, it starts working with fp16.
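A minimal sketch of the failure mode (the layer below is hypothetical, not the actual beit.py code): under a mixed_float16 policy, Keras auto-casts layer inputs to float16, so any op that mixes them with a tensor pinned to float32 raises a dtype-mismatch error.

```python
import tensorflow as tf
from tensorflow import keras

class HardCodedScale(keras.layers.Layer):
    """Hypothetical layer with the bug pattern: dtype hard-coded to float32."""
    def call(self, inputs):
        # Under mixed_float16, `inputs` arrive as float16, but this constant
        # is pinned to float32, so the multiply raises a dtype error.
        scale = tf.cast(0.5, "float32")
        return inputs * scale

keras.mixed_precision.set_global_policy("mixed_float16")
try:
    HardCodedScale()(tf.ones([1, 4]))
except Exception as err:
    print("dtype mismatch:", type(err).__name__)
```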

leondgarse commented 1 year ago

Are you using keras.mixed_precision.Policy("float16")? I was actually aware of this issue, but was only expecting usage with keras.mixed_precision.Policy("mixed_float16")... Anyway, it should be fixed now.

AlexanderYaroshevichIAC commented 1 year ago

Using with keras.mixed_precision.Policy("mixed_float16")

leondgarse commented 1 year ago

How is it after the previous fix? I've replaced all those dtypes with self.compute_dtype. Does the issue still exist?
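The fix described above can be sketched like this (again with a hypothetical layer, not the actual beit.py code): deriving the dtype from self.compute_dtype makes the constant follow the active policy, so the same layer works under float32, float16, and mixed_float16.

```python
import tensorflow as tf
from tensorflow import keras

class PolicyAwareScale(keras.layers.Layer):
    """Hypothetical layer using the fix: dtype follows the active policy."""
    def call(self, inputs):
        # self.compute_dtype is float16 under mixed_float16 and float32
        # otherwise, so the constant always matches the auto-cast inputs.
        scale = tf.cast(0.5, self.compute_dtype)
        return inputs * scale

keras.mixed_precision.set_global_policy("mixed_float16")
out = PolicyAwareScale()(tf.ones([1, 4]))
print(out.dtype)  # <dtype: 'float16'>
```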

leondgarse commented 11 months ago

Closing, it should work now.