Instead of supporting only OpenAI and Claude, please consider adding LiteLLM (an open-source multi-LLM abstraction layer) to this project. That way we could point it at a local proxy server API endpoint that fronts either a self-hosted open-source LLM, a hosted open-source model (e.g. Llama or Mistral via Groq), or a proprietary LLM like Google Gemini, and use that backend to generate prompts as needed. I know the performance may not match GPT-4, but open-source models can still deliver results many times better than GPT-3.5.
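As a sketch of what this could look like: LiteLLM's proxy server is configured with a `config.yaml` listing the backends, and the application then only needs to talk to the proxy's single OpenAI-compatible endpoint. The model names and the local Ollama URL below are illustrative assumptions, not part of this project:

```yaml
# Hypothetical LiteLLM proxy config.yaml mixing self-hosted, hosted
# open-source, and proprietary backends behind one endpoint.
model_list:
  - model_name: local-llama          # self-hosted model served via Ollama
    litellm_params:
      model: ollama/llama3
      api_base: http://localhost:11434
  - model_name: groq-llama           # hosted open-source model on Groq
    litellm_params:
      model: groq/llama3-70b-8192
      api_key: os.environ/GROQ_API_KEY
  - model_name: gemini               # proprietary model via the same interface
    litellm_params:
      model: gemini/gemini-pro
      api_key: os.environ/GEMINI_API_KEY
```

The project would then send its prompt-generation requests to the proxy (e.g. `http://localhost:4000`) using its existing OpenAI-style client code, and users could switch models just by editing this config.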