greyovo / CLIP-android-demo

A demo for running quantized CLIP model (ViT-B/32) on Android.

Can you upload the onnx model and released APP ? #5

Closed caofx0418 closed 1 week ago

caofx0418 commented 2 months ago

Can you upload the onnx model and released APP ?

clip-image-encoder-quant-int8 clip-text-encoder-quant-int8

Thanks!

greyovo commented 2 months ago

Please see PicQuery here: https://github.com/greyovo/PicQuery

willswordh commented 1 month ago

> Please see PicQuery here: https://github.com/greyovo/PicQuery

@greyovo Hi, great work! May I ask what a resolution of 400px means in the speed section? Does it mean 400 × 400 pixels, or 400 pixels in total (e.g. 20 × 20)? Thanks a lot!

greyovo commented 1 month ago

It means 400px * 400px.
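
For reference, a minimal Kotlin sketch of scaling a test image to that 400 × 400 px size before timing the encoder (the actual benchmark code in the repo may differ):

```kotlin
import android.graphics.Bitmap

// Scale an arbitrary input bitmap to the 400 x 400 px resolution
// quoted in the speed section, using bilinear filtering.
fun resizeForBenchmark(src: Bitmap): Bitmap =
    Bitmap.createScaledBitmap(src, 400, 400, /* filter = */ true)
```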

willswordh commented 3 weeks ago

@greyovo Hi! May I ask which execution provider you chose after quantizing CLIP and exporting it to an ORT model? Is it the Android Neural Networks API (NNAPI)? Thanks a lot!

greyovo commented 1 week ago

@willswordh I just use the default provider. I did try NNAPI, but it didn't seem to fit this model well.
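
For reference, a minimal sketch (not the exact PicQuery code) of how the two providers differ when creating an ONNX Runtime session on Android, assuming the `onnxruntime-android` package; the model is loaded from bytes and the file name is illustrative:

```kotlin
import ai.onnxruntime.OrtEnvironment
import ai.onnxruntime.OrtSession

// Create a session for the quantized encoder.
// With no execution provider added, ONNX Runtime falls back to the
// default CPU provider, which is what the author reports using.
fun createSession(env: OrtEnvironment, modelBytes: ByteArray): OrtSession {
    val options = OrtSession.SessionOptions()
    // To experiment with NNAPI instead (reported to not fit this model well):
    // options.addNnapi()
    return env.createSession(modelBytes, options)
}
```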