Closed: @clementou closed this 6 months ago
@clementou I feel like it is not scalable to keep adding models to LLM_Name. Since there are libraries like LiteLLM that convert all models to a uniform API, we should allow using custom endpoints rather than overflow our code with a long list of supported LLMs.
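To illustrate the point about a uniform API: a minimal sketch of routing an arbitrary OpenAI-compatible endpoint through LiteLLM's `completion()` call instead of hard-coding each model in `LLM_Name`. The model name and endpoint URL here are placeholders, not real services, and the actual network call is left commented out.

```python
def build_completion_kwargs(model: str, api_base: str, prompt: str) -> dict:
    """Assemble the arguments LiteLLM's completion() accepts for a
    custom OpenAI-compatible endpoint (model, api_base, messages)."""
    return {
        "model": model,          # e.g. "openai/my-local-model" (placeholder)
        "api_base": api_base,    # any OpenAI-compatible server (placeholder)
        "messages": [{"role": "user", "content": prompt}],
    }

kwargs = build_completion_kwargs(
    "openai/my-local-model", "http://localhost:8000/v1", "Hello!"
)
# With litellm installed and a server running, one would then call:
#   import litellm
#   response = litellm.completion(**kwargs)
print(kwargs["model"])  # → openai/my-local-model
```

The point is that the codebase would only need one code path, with the model and endpoint coming from user config rather than an ever-growing enum.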
@lwaekfjlk @ruiyiw @Jasonqi146 @sharonwx54 worked on adding a custom model that could support all possible endpoints. There's no need to duplicate that effort. Could y'all share how to use your custom_model option?
Thanks @ProKil! @clementou Feel free to chat with people to figure out their solutions! You can post questions directly in the sotopia-technical channel (or simply communicate here; I just find people usually don't check this haha)
Description
TogetherAI models:
Hugging Face models:
Tsinghua:
FastChat:
Nvidia:
- [ ] Llama2-70B-SteerLM-Chat
- [ ] rwkv-6-eagle
Proprietary:
- Anthropic:
- Google:
- Mistral:
- Perplexity:
Additional Information
No response