lm-sys / FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Apache License 2.0

Support Bailing LLM from ALIPAY #3487

Open cuauty opened 2 months ago

cuauty commented 2 months ago

We would like our LLM, named Bailing, to become one of the selectable models on chat.lmsys.org and join the chats on the website. We have set up our own HTTP endpoint for the LLM's inference; Bailing is already compatible with the OpenAI client, and I have passed the tests from the FastChat documentation in my local environment.

I can open a PR to submit my code to fastchat/serve/api_provider.py. Is that all I need to do? Thank you.
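For reference, "compatible with the OpenAI client" means the endpoint accepts the standard chat-completions request shape. Below is a minimal sketch of that request body; the model id, field values, and helper name are placeholders for illustration, not the actual Bailing deployment or FastChat code.

```python
import json

def build_chat_request(model, messages, temperature=0.7, stream=True):
    """Build the JSON body a client would POST to an OpenAI-compatible
    /v1/chat/completions endpoint. All values here are placeholders."""
    return json.dumps({
        "model": model,            # provider-specific model id
        "messages": messages,      # list of {"role": ..., "content": ...}
        "temperature": temperature,
        "stream": stream,          # True -> server-sent-event chunks
    })

# Hypothetical example request for a model named "Bailing".
body = build_chat_request(
    "Bailing",
    [{"role": "user", "content": "Hello"}],
)
print(body)
```

An API provider in fastchat/serve/api_provider.py would typically build a request of roughly this shape and then consume either a full JSON response or a streamed response, depending on `stream`.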

jfdimgo commented 1 month ago

this PR: #3543

cuauty commented 1 month ago

this PR: #3543

@jfdimgo

Our LLM is named "Bailing" (with a lowercase "l" inside), and we have set up the endpoint for API-based inference. Any comments are welcome.

Could you point out what else we need to do next? Please feel free to contact us if you have any questions. Thanks a lot.