LAION-AI / CLIP_benchmark

CLIP-like model evaluation
MIT License

Move support of model types to all_clip module. #118

Open rom1504 opened 10 months ago

rom1504 commented 10 months ago

I created the all_clip module in order to have a single place that supports all kinds of CLIP models. It is already used in clip-retrieval, and I propose using it here too.

rom1504 commented 10 months ago

what do you think @mehdidc ?

mehdidc commented 10 months ago

@rom1504 I think this is super cool! It makes a lot of sense to have a separate package for that. Currently, there are two ways to specify which model to use: first, by providing --model, --pretrained, and --model_type separately; or by using --pretrained_model, where we provide both model and pretrained together, separated by a comma (https://github.com/LAION-AI/CLIP_benchmark/blob/main/clip_benchmark/cli.py#L128). If --pretrained_model is used, it takes precedence over --model / --pretrained. Maybe, in the case where --pretrained_model is provided, we could additionally detect and support the all_clip string format (i.e., open_clip:ViT-B-32/laion2b_s34b_b79k) to make it easy to use the same string format that all_clip uses. What do you think?
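The detection proposed above could look something like the sketch below. The helper name, the default model type, and the normalization to a tuple are all assumptions for illustration, not the actual CLIP_benchmark or all_clip API:

```python
# Hypothetical sketch: accept either the legacy CLIP_benchmark
# "model,pretrained" format or an all_clip-style
# "model_type:model/pretrained" string for --pretrained_model,
# and normalize both to a (model_type, model, pretrained) tuple.

def parse_pretrained_model(spec: str, default_type: str = "open_clip"):
    """Return (model_type, model, pretrained) for either supported format."""
    if ":" in spec:
        # all_clip-style, e.g. "open_clip:ViT-B-32/laion2b_s34b_b79k"
        model_type, rest = spec.split(":", 1)
        model, _, pretrained = rest.partition("/")
    else:
        # legacy comma-separated style, e.g. "ViT-B-32,laion2b_s34b_b79k";
        # fall back to a default model type since none is encoded
        model_type = default_type
        model, _, pretrained = spec.partition(",")
    return model_type, model, pretrained
```

Because the all_clip format always carries a colon, checking for it first lets both formats coexist on the same flag without ambiguity.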