leondgarse / keras_cv_attention_models

Keras beit,caformer,CMT,CoAtNet,convnext,davit,dino,efficientdet,edgenext,efficientformer,efficientnet,eva,fasternet,fastervit,fastvit,flexivit,gcvit,ghostnet,gpvit,hornet,hiera,iformer,inceptionnext,lcnet,levit,maxvit,mobilevit,moganet,nat,nfnets,pvt,swin,tinynet,tinyvit,uniformer,volo,vanillanet,yolor,yolov7,yolov8,yolox,gpt2,llama2, alias kecam
MIT License

Model conversion error (MobileViT) #74

Closed mhyeonsoo closed 1 year ago

mhyeonsoo commented 2 years ago

Hi @leondgarse,

Sorry to open another issue about model conversion.

After updating the keras_cv_attention_models package to the latest commit (the one you made for tflite conversion), I started hitting an error.

I saw that you added a reshaping step before the self-attention layer, but since this change the conversion fails with the error below.

/usr/local/lib/python3.8/dist-packages/tensorflow/python/saved_model/save.py:1369:0: note: Error code: ERROR_NEEDS_FLEX_OPS
<unknown>:0: error: failed while converting: 'main': 
Some ops are not supported by the native TFLite runtime, you can enable TF kernels fallback using TF Select. See instructions: https://www.tensorflow.org/lite/guide/ops_select 
TF Select ops: Conv2D
Details:
    tf.Conv2D(tensor<?x?x?x?xf32>, tensor<1x1x128x256xf32>) -> (tensor<?x?x?x256xf32>) : {data_format = "NHWC", device = "", dilations = [1, 1, 1, 1], explicit_paddings = [], padding = "VALID", strides = [1, 1, 1, 1], use_cudnn_on_gpu = true}
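As the error message suggests, one workaround (not a real fix) is to enable the TF Select fallback on the converter so unsupported ops run on TF kernels. A minimal sketch, using a toy Keras model as a stand-in for the actual MobileViT model:

```python
import tensorflow as tf

# Toy stand-in model; in the real case this would be a kecam MobileViT model.
model = tf.keras.Sequential([
    tf.keras.layers.Input((32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Allow ops without native TFLite kernels to fall back to TF kernels (Flex).
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
print(len(tflite_model) > 0)
```

Note the resulting .tflite file then requires the Flex delegate at runtime, which increases binary size, so fixing the model graph itself is preferable.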

Just FYI: with the previous version of the package, model conversion with TFLiteConverter worked without any error.

Can you check this issue from your side?

Thanks,

leondgarse commented 2 years ago

Sorry, just back from vacation, and yes, you are right! In my tests it's because I missed another GroupNorm -> Conv2D usage in MobileViTV2. Fixed in the above PR. It's weird that the conversion worked normally in my earlier tests... Here is my colab test: kecam_test.ipynb. You may try again on your side. Sorry for the late reply. :)
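A quick way to confirm that a fixed model really converts to pure builtin ops is to convert without the Flex fallback and run the result through tf.lite.Interpreter. A sketch, again with a toy Conv2D model standing in for MobileViTV2:

```python
import numpy as np
import tensorflow as tf

# Toy stand-in; the real check would use a kecam MobileViTV2 model.
model = tf.keras.Sequential([
    tf.keras.layers.Input((32, 32, 3)),
    tf.keras.layers.Conv2D(8, 3, padding="same"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # default TFLITE_BUILTINS only, no Flex

# If the model still contained Flex ops, Interpreter would raise here.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

Since conversion above uses only the default builtin op set, the converted model also runs on the native TFLite runtime without any delegate.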

leondgarse commented 1 year ago

Closing now. Please re-open if the issue still exists.