Problem

A number of other LLM providers, such as LocalAI, are compatible with the OpenAI API specification. These providers can already be used with Token.js by setting the `baseUrl` field and using the OpenAI provider. However, this behavior isn't documented, and type support for it is poor.
Solution
Token.js should officially support any OpenAI API-compatible provider. We should do this by creating a new provider and handler, `openai-compatible`, whose `model` field is a generic string. This provider should throw an error if a `baseUrl` value is not provided, and it should reuse as much logic as possible from the OpenAI handler in its implementation.
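As a rough illustration of the required-`baseUrl` behavior, the new handler's config validation could look something like the sketch below. The names (`OpenAICompatibleConfig`, `validateOpenAICompatibleConfig`) and the error message are illustrative assumptions, not existing Token.js APIs:

```typescript
// Hypothetical sketch of the config validation for the proposed
// `openai-compatible` provider. Type and function names are assumptions.

interface OpenAICompatibleConfig {
  baseURL?: string
  apiKey?: string
}

function validateOpenAICompatibleConfig(config: OpenAICompatibleConfig): string {
  // Unlike the OpenAI provider, there is no default endpoint to fall back
  // to, so a baseURL is mandatory.
  if (!config.baseURL) {
    throw new Error(
      "The 'openai-compatible' provider requires a baseURL to be set."
    )
  }
  return config.baseURL
}
```

After this check passes, the handler could delegate to the existing OpenAI handler logic, pointing the underlying client at the validated `baseURL` and passing the `model` string through unchanged.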