mazzzystar / Queryable

Run OpenAI's CLIP and Apple's MobileCLIP model on iOS to search photos.
https://queryable.app
MIT License
2.66k stars 421 forks

Provide the `.mlmodel` / `.mlpackage` files instead of the `.mlmodelc` ones #7

Closed laclouis5 closed 1 year ago

laclouis5 commented 1 year ago

Currently only the compiled Core ML models (`.mlmodelc` files) are provided in the Google Drive. Providing the `.mlmodel`, or even better the `.mlpackage` files, would be great since this would allow tweaking the network and experimenting with quantization options.

mazzzystar commented 1 year ago

Good point, the `.mlpackage` file has now been added. Let me know if it's not the right model, as I exported several models under the same `.mlpackage` name.

laclouis5 commented 1 year ago

One last request: could you share the PyTorch checkpoints / weights of the two models, i.e. the `.pt` or `.pth` files?

mazzzystar commented 1 year ago

Here it is: Queryable#core-ml-export. The model checkpoint is from CLIP ViT-B/32; I separated the TextEncoder and ImageEncoder and exported each one.