dmarx / Multi-Modal-Comparators

Unified API to facilitate usage of pre-trained "perceptor" models, a la CLIP

Ezmode #37

Closed: dmarx closed this 1 year ago

dmarx commented 2 years ago
# ta dah! a single import gives you the ez-mode CLIP shim
from mmc.ez.CLIP import clip

# mhm. list the model names the shim knows how to load
clip.available_models()

# requesting a tokenizer *before* loading a model
# returns the openai clip SimpleTokenizer
# tokenize = clip.tokenize

# either of these works: the plain openai model name, or the full mmc loader key
model, preprocessor = clip.load('RN50')
model, preprocessor = clip.load('[clip - openai - RN50]')

# if we request the tokenizer *after* a model has been loaded,
# the tokenizer appropriate to the loaded model is returned
tokenize = clip.tokenize
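
Assuming the ez shim mirrors the openai CLIP calling convention (encode_image / encode_text on the preprocessed image and tokenized text), downstream usage would look like the usual pattern. This is just an illustrative sketch; photo.jpg and the prompts are placeholders, not part of the example above.

# illustrative continuation, assuming the loaded objects behave like openai CLIP
import torch
from PIL import Image

image = preprocessor(Image.open('photo.jpg')).unsqueeze(0)  # placeholder image path
text = tokenize(['a photo of a dog', 'a photo of a cat'])

with torch.no_grad():
    image_features = model.encode_image(image)
    text_features = model.encode_text(text)

# cosine similarity between the image and each prompt
image_features = image_features / image_features.norm(dim=-1, keepdim=True)
text_features = text_features / text_features.norm(dim=-1, keepdim=True)
similarity = image_features @ text_features.T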

rom1504 commented 2 years ago

With the right line in the top-level init, you can make it so it'll be from mmc import clip, which is a bit nicer.

Can you also put that example of usage in the readme?
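
For reference, the "right line in the top-level init" could be as simple as a re-export like the sketch below; the exact import path depends on how the mmc package is actually laid out, so treat this as illustrative rather than the real contents of the file.

# sketch of a possible mmc/__init__.py addition, so callers can write
# `from mmc import clip` instead of `from mmc.ez.CLIP import clip`
from .ez.CLIP import clip

__all__ = ['clip']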

dmarx commented 2 years ago

oy vey, I didn't realize I left this unmerged. I'll make sure to do a solid chunk of mmc dev work this coming week; I don't want this project to stagnate.

dmarx commented 1 year ago

clipfa is still broken, but I think that was an issue before I even opened this PR. lgtm