elie222 opened 6 months ago
We use the `ai` package: https://sdk.vercel.ai/docs. It's easy to switch between LLMs with it. Here's a playground, for example: https://sdk.vercel.ai/

That said, there may be features that don't work across LLMs, like function calling, so we'd have to handle those differently.
If we could support Mistral, that would be very cool and would help us move toward a 100% self-hosted version.
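To sketch the idea of swapping providers: with the `ai` SDK you'd normally pass a provider's model object (e.g. from `@ai-sdk/openai` or `@ai-sdk/mistral`) into `generateText({ model, prompt })`. Below is a minimal, dependency-free illustration of the registry pattern that would let us pick a provider (Mistral, OpenAI, a local model, ...) from config. The names `getModel` and `providers` are hypothetical, and the string factories stand in for the real SDK's model constructors.

```typescript
// Hypothetical provider registry. In the real ai SDK, each factory would
// return a model object (e.g. mistral("mistral-large-latest")) that is
// then passed to generateText({ model, prompt }); plain strings stand in
// here so the sketch runs without any API keys or packages.
type ModelFactory = (modelId: string) => string;

const providers: Record<string, ModelFactory> = {
  openai: (id) => `openai:${id}`,
  mistral: (id) => `mistral:${id}`,
};

function getModel(provider: string, modelId: string): string {
  const factory = providers[provider];
  if (!factory) throw new Error(`Unsupported provider: ${provider}`);
  return factory(modelId);
}

// Switching LLMs then becomes a one-line config change:
console.log(getModel("mistral", "mistral-large-latest"));
```

Function calling would be the tricky part, since not every provider supports it; the registry could carry a capability flag so we fall back gracefully for models that lack it.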
Can I work on this?
Yes!
This is a game changer, I guess, especially if we can do it with Llama 3, don't you think? I wonder why you chose Mistral; I think we could also add Llama via Groq!