Closed spacex-3 closed 10 months ago
Assuming your models are OpenAI-compatible, this is possible. You need to add `openai/` as a prefix to the model name:

```
model = openai/gpt-4-mobile
```
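A minimal sketch of what that looks like in code, assuming an OpenAI-compatible backend (the `api_base` URL and the `build_completion_kwargs` helper below are placeholders for illustration, not part of LiteLLM's API). The `openai/` prefix tells LiteLLM to route the request through its OpenAI-compatible code path, so a custom name like `gpt-4-mobile` works as long as the backend accepts it:

```python
def build_completion_kwargs(model_name: str, api_base: str, prompt: str) -> dict:
    """Assemble the arguments you would pass to litellm.completion()."""
    return {
        # the "openai/" prefix selects LiteLLM's OpenAI-compatible route
        "model": f"openai/{model_name}",
        "api_base": api_base,  # your OpenAI-compatible proxy endpoint
        "messages": [{"role": "user", "content": prompt}],
    }

kwargs = build_completion_kwargs(
    "gpt-4-mobile",
    "http://localhost:8080/v1",  # placeholder URL, point at your own proxy
    "Hello!",
)
# Then: response = litellm.completion(**kwargs)
print(kwargs["model"])  # -> openai/gpt-4-mobile
```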
@spacex-3 does this fix your problem?
> @spacex-3 does this fix your problem?

My models are OpenAI-compatible, such as gpt-4-abc. It works now that I've changed the model name to the OpenAI-compatible form. Thanks!
What happened?
repo: iuiaoin/wechat-gptbot. I used the above project to integrate with a WeChat bot, but I need to use the 'gpt-4-mobile' or 'gpt-4-s' models from PandoraNext. Currently, it seems these cannot be used because LiteLLM lacks support for such custom model names.