We currently only support text-embedding-ada-002, but we should support all of them. We can do this in a similar fashion to how we support all of the Hugging Face "sentence transformers".
We will need to decide how we want to flag a model as "openai" vs. "sentence transformer", though. This could be done with a namespace or org prefix, e.g.
`transformer => 'openai/text-embedding-ada-002'`
This would make the Hugging Face and OpenAI interfaces consistent, and we could keep it backwards compatible for a while by still routing `transformer => 'text-embedding-ada-002'` to OpenAI.
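A minimal sketch of what the prefix-based routing could look like, assuming a hypothetical `resolve_backend` helper (the function and backend names here are illustrative, not the actual implementation):

```python
# Hypothetical sketch: route a `transformer` argument to a backend based on
# an org/namespace prefix. Names and structure are illustrative only.
OPENAI_PREFIX = "openai/"

# Models we historically routed to OpenAI without a prefix,
# kept for backwards compatibility.
LEGACY_OPENAI_MODELS = {"text-embedding-ada-002"}


def resolve_backend(transformer: str) -> tuple[str, str]:
    """Return (backend, model_name) for a given `transformer` value."""
    if transformer.startswith(OPENAI_PREFIX):
        # Explicitly namespaced OpenAI model, e.g. 'openai/text-embedding-ada-002'.
        return "openai", transformer[len(OPENAI_PREFIX):]
    if transformer in LEGACY_OPENAI_MODELS:
        # Backwards compatible: the bare model name still routes to OpenAI.
        return "openai", transformer
    # Everything else is treated as a Hugging Face sentence transformer.
    return "sentence_transformers", transformer


# Examples:
# resolve_backend("openai/text-embedding-ada-002") -> ("openai", "text-embedding-ada-002")
# resolve_backend("text-embedding-ada-002")        -> ("openai", "text-embedding-ada-002")
# resolve_backend("all-MiniLM-L6-v2")              -> ("sentence_transformers", "all-MiniLM-L6-v2")
```

The legacy set could be dropped later once callers have migrated to the prefixed form.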