neuralmagic / deepsparse

Sparsity-aware deep learning inference runtime for CPUs
https://neuralmagic.com/deepsparse/

Add CLIP model to enable test_clip.py #1500

Open mgoin opened 8 months ago

mgoin commented 8 months ago

Since we've made a quantized CLIP (https://huggingface.co/neuralmagic/CLIP-ViT-B-32-256x256-DataComp-s34B-b86K-quant-ds), let's use it to test our pipeline!

mgoin commented 8 months ago

@dsikka it is only intended for image/text retrieval and zero-shot image classification, so I left the pytest.skip in place for the captioning test. The model was produced from OpenCLIP using a special branch that @ohaijen worked on. Since this was a collaboration focused on getting results and pipelines working quickly, we pushed everything to HF so that running the model is largely self-contained within that model card and its notebooks. My opportunistic thinking here was that we could reuse the model we have on HF for some active testing on DeepSparse. I don't think there are plans to push CLIP models to SparseZoo at the moment, but if you think it's necessary for testing, we could try to get a model up there.
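For context, the zero-shot classification task this model supports boils down to comparing normalized image and text embeddings by cosine similarity. A minimal NumPy sketch of that scoring step (the embeddings here are toy stand-ins for what a CLIP image/text encoder would actually produce, and `zero_shot_classify` is an illustrative helper, not a DeepSparse API):

```python
import numpy as np

def zero_shot_classify(image_emb, text_embs):
    # Normalize embeddings to unit length, as CLIP does before scoring
    image_emb = image_emb / np.linalg.norm(image_emb)
    text_embs = text_embs / np.linalg.norm(text_embs, axis=1, keepdims=True)
    # Cosine similarity between the image and each candidate label prompt
    scores = text_embs @ image_emb
    # The highest-scoring label prompt is the zero-shot prediction
    return int(np.argmax(scores))

# Toy example: 3 candidate label prompts with 4-dim embeddings
text_embs = np.array([
    [1.0, 0.0, 0.0, 0.0],   # e.g. "a photo of a cat"
    [0.0, 1.0, 0.0, 0.0],   # e.g. "a photo of a dog"
    [0.0, 0.0, 1.0, 0.0],   # e.g. "a photo of a bird"
])
image_emb = np.array([0.1, 0.9, 0.1, 0.0])  # closest to the "dog" prompt
print(zero_shot_classify(image_emb, text_embs))  # -> 1
```

Captioning is a different decoding task on top of the encoders, which is why it stays skipped for this checkpoint.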