Intex32 opened 1 year ago
After further assessment, I think maintaining a model structure (such as `ModelType`) beside the actual models with capabilities is not worth the effort. I suggest inlining the functionality of `ModelType` into the implementations of `LLM`. This is necessary because the encoding and context length are not the same for all models and providers, which requires some more hierarchy. Currently, `ModelType` is heavily designed for OpenAI. This idea and its consequences are the subject of exploration in this issue.
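To make the inlining idea concrete, here is a minimal sketch of what it could look like: each `LLM` implementation carries its own context length and token counting, instead of delegating to a shared `ModelType`. The interface shape and the class name `OpenAIChatModel` are hypothetical, not taken from the actual codebase.

```java
// Hypothetical sketch: each LLM implementation owns the data that
// ModelType used to hold (context length, token counting).
interface LLM {
    String name();
    int contextLength();
    int countTokens(String text);
}

// A provider-specific implementation supplies its own values.
final class OpenAIChatModel implements LLM {
    public String name() { return "gpt-3.5-turbo"; }
    public int contextLength() { return 4096; }

    // OpenAI publishes its encodings, so tokens could be counted
    // locally; a real implementation would use a BPE tokenizer here.
    public int countTokens(String text) {
        return text.length() / 4; // rough placeholder heuristic
    }
}
```

With this shape, a provider whose encoding is not public (e.g. GCP) is free to implement `countTokens` via an API call instead, without any shared `ModelType` hierarchy having to anticipate that.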
`ModelType` held the model's properties, e.g. its name, context length, and encoding type. The encoding type was used, e.g., for summarization or for adapting a prompt to the context size.
Goal: generalize the `ModelType` class from OpenAI-specific implementations to support multiple providers.
I consider `ModelType` a formal description of a model and its properties, without any capabilities. This includes e.g. `name` and `contextLength`.

My suggestions:
- move `tokenizer` to core
- `ModelType` instances can be found directly in the provider-specific classes that contain the instances of available models; an instance is passed as a parameter to the respective model constructor => move the OpenAI instances and create new instances with the correct parameters for gpt4all and GCP
- GCP appears not to publish information about the encoding it uses; thus we have to make an API call to GCP to retrieve the token count of a given message; for OpenAI, the encoding is used to compute the token count locally

depends on #393
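The local-vs-remote split above could be sketched as one tokenizer abstraction with two strategies. The interface and class names here are hypothetical, and the remote variant takes a counting function as a stand-in for a real GCP client:

```java
import java.util.function.ToIntFunction;

// Hypothetical sketch: one Tokenizer interface, two strategies.
interface Tokenizer {
    int countTokens(String text);
}

// OpenAI case: the encoding is public, so counting happens locally.
final class LocalTokenizer implements Tokenizer {
    public int countTokens(String text) {
        // placeholder for a real BPE implementation
        return Math.max(1, text.length() / 4);
    }
}

// GCP case: no public encoding, so counting requires a remote call.
final class RemoteTokenizer implements Tokenizer {
    private final ToIntFunction<String> api;

    RemoteTokenizer(ToIntFunction<String> api) {
        this.api = api; // e.g. a wrapper around GCP's count-tokens endpoint
    }

    public int countTokens(String text) {
        return api.applyAsInt(text);
    }
}
```

Callers (summarization, prompt fitting) would depend only on `Tokenizer`, so they need not know whether counting is local or a network round trip.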