Closed — TechnologyClassroom closed this issue 3 months ago
The industry standard, even for open-source projects, tends to lean toward cloud AI unless the project is processing a huge volume of requests. I'm using the Vercel AI SDK, and you could switch to an alternative model or provider, just not a self-hosted one. Running a self-hosted LLM wouldn't be efficient for a small project like this when cloud AI costs are already very low.
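To illustrate what a provider swap looks like, here is a minimal sketch using the Vercel AI SDK. It assumes the `@ai-sdk/openai` and `@ai-sdk/anthropic` provider packages are installed; the model names and prompt are illustrative only, not what this project actually uses.

```ts
// Sketch: switching providers with the Vercel AI SDK.
// Assumes `npm install ai @ai-sdk/openai @ai-sdk/anthropic` and the
// corresponding API keys set in the environment.
import { generateText } from 'ai';
import { openai } from '@ai-sdk/openai';
import { anthropic } from '@ai-sdk/anthropic';

async function summarize(prompt: string) {
  // Current setup: an OpenAI-hosted model.
  const { text } = await generateText({
    model: openai('gpt-4o-mini'), // illustrative model name
    prompt,
  });
  return text;
}

async function summarizeWithAlternative(prompt: string) {
  // Swapping providers is a one-line change to the `model` argument;
  // the rest of the call stays the same.
  const { text } = await generateText({
    model: anthropic('claude-3-5-haiku-latest'), // illustrative model name
    prompt,
  });
  return text;
}
```

Because the SDK abstracts the provider behind the `model` option, moving between hosted providers is cheap; the point above is that the same is not true of operating your own inference server.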
Self-hosted LLM technology has come a long way. Have you looked into replacing OpenAI with one of the local alternatives?