This adds support for Anthropic as an alternative to OpenAI for the LLM. It's also a good basis for adding other providers, such as Mistral. The call to the LLM has been encapsulated in a new function, which calls the correct LLM based on the user's preferences.
A few things still have to happen before this goes live in the public-facing app. Self-hosters, however, can already use this feature today by setting the correct model/provider in their database and the AI key in the `.env` file:
[ ] Store correct costs for the different LLMs.
[ ] Get function calling with Anthropic to work. This is used in one place in the app.
[ ] Allow user to switch their LLM provider/model on the settings page.
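To illustrate the encapsulation described above, here is a minimal sketch of how the dispatch function could route a call to the provider stored in the user's preferences. All names (`UserPreferences`, `call_llm`, the provider stubs) are illustrative assumptions, not the actual implementation:

```python
# Hypothetical sketch: route an LLM call based on the user's stored
# provider/model preference. Provider functions are stand-in stubs.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class UserPreferences:
    provider: str  # e.g. "openai" or "anthropic"
    model: str


def call_openai(prompt: str, model: str) -> str:
    # Placeholder for the real OpenAI API call.
    return f"[openai:{model}] {prompt}"


def call_anthropic(prompt: str, model: str) -> str:
    # Placeholder for the real Anthropic API call.
    return f"[anthropic:{model}] {prompt}"


# Registry mapping provider names to their call functions; adding a new
# provider (e.g. Mistral) would mean adding one entry here.
PROVIDERS: Dict[str, Callable[[str, str], str]] = {
    "openai": call_openai,
    "anthropic": call_anthropic,
}


def call_llm(prefs: UserPreferences, prompt: str) -> str:
    """Dispatch the prompt to the provider chosen in the user's preferences."""
    try:
        handler = PROVIDERS[prefs.provider]
    except KeyError:
        raise ValueError(f"Unsupported LLM provider: {prefs.provider}")
    return handler(prompt, prefs.model)
```

A registry like this keeps the rest of the app unaware of which provider is in use; swapping providers is a single preferences change rather than a code change.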