aditya-shri opened this issue 9 months ago
Most self-hosted API servers require a `model` parameter in their requests, and the model name differs from the OpenAI model names. Hence, with the current architecture, I cannot use any self-hosted server. One possible solution is to pass the `model` parameter as an environment variable.
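For reference, here is a minimal sketch of how the model name and server address could be read from environment variables, assuming the `openai` Python SDK and hypothetical variable names `SELF_HOSTED_BASE_URL` and `SELF_HOSTED_MODEL`:

```python
import os

from openai import OpenAI

# Sketch only: the environment variable names below are illustrative,
# not part of the project's actual configuration.
client = OpenAI(
    # Point the client at the self-hosted, OpenAI-compatible server.
    base_url=os.environ.get("SELF_HOSTED_BASE_URL", "http://localhost:8000/v1"),
    # Many self-hosted servers ignore the key but the SDK still requires one.
    api_key=os.environ.get("OPENAI_API_KEY", "not-needed"),
)

# Take the model name from the environment instead of hard-coding an OpenAI name.
model = os.environ.get("SELF_HOSTED_MODEL", "gpt-3.5-turbo")

response = client.chat.completions.create(
    model=model,
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```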
Added support for using self-hosted AI models. The self-hosted models must expose an OpenAI-compatible API.