cuauty opened 2 months ago
this PR: #3543
@jfdimgo
We have an LLM named "Bailing" (with a lower-case "l" inside) and have set up an endpoint for API-based inference; any comments are welcome.
Could you help point out what else we need to do next? Please feel free to contact us if you have any questions. Thanks a lot.
We hope to make our LLM, Bailing, one of the optional LLMs on chat.lmsys.org and have it join the chat on the website. We have set up our own HTTP endpoint for the model's inference; Bailing is already compatible with the OpenAI client, and I have passed the tests described in the FastChat documentation in my local environment.
I can open a PR to submit my code to fastchat/serve/api_provider.py. Is that all I need to do? Thank you.
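For reference, the kind of local compatibility check mentioned above can be sketched with only the Python standard library. This is a hypothetical smoke test, not the actual Bailing integration: the `BASE_URL`, the model name `"bailing"`, and the `"EMPTY"` API key are placeholders, and the request body follows the standard OpenAI-compatible `/v1/chat/completions` schema.

```python
# Hypothetical smoke test against an OpenAI-compatible chat endpoint.
# BASE_URL and the model name are placeholders, not the real Bailing endpoint.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # placeholder endpoint


def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body an OpenAI-compatible /v1/chat/completions route expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat(model: str, prompt: str, api_key: str = "EMPTY") -> str:
    """Send one chat request and return the assistant's reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # OpenAI-compatible responses carry the reply under choices[0].message.content.
    return data["choices"][0]["message"]["content"]
```

If the endpoint is truly OpenAI-compatible, the official `openai` Python client with `base_url=BASE_URL` should work the same way.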