mckaywrigley / chatbot-ui

AI chat for every model.
https://chatbotui.com
MIT License
28.29k stars 7.85k forks

Support open-source models with OpenAI-compliant API interface #865

Closed imoneoi closed 8 months ago

imoneoi commented 1 year ago

Could you support open-source LLMs that expose the same API interface as OpenAI's, such as FastChat?

I think one viable approach would be to parse model information from the /models API instead of hardcoding it into the OpenAIModels class. For now, I have temporarily added entries for several open-source models to OpenAIModels so I can use the UI with them.
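A sketch of that approach, assuming the server follows OpenAI's `GET /v1/models` response shape (`{ data: [{ id: ... }] }`); `fetchModels`, `toModelList`, and `FALLBACK_TOKEN_LIMIT` are illustrative names, and the fallback limit is a guess since the API does not report a context size:

```typescript
interface OpenAIModel {
  id: string;
  name: string;
  tokenLimit: number;
}

// Assumption: /v1/models does not report a context length, so unknown
// models get a conservative default.
const FALLBACK_TOKEN_LIMIT = 4096;

// Pure mapper from an OpenAI-style /v1/models body to UI model entries.
function toModelList(body: { data: { id: string }[] }): OpenAIModel[] {
  return body.data.map((m) => ({
    id: m.id,
    name: m.id,
    tokenLimit: FALLBACK_TOKEN_LIMIT,
  }));
}

// Fetch the list from any OpenAI-compatible server instead of hardcoding it.
async function fetchModels(apiHost: string, apiKey: string): Promise<OpenAIModel[]> {
  const res = await fetch(`${apiHost}/v1/models`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) {
    throw new Error(`GET /v1/models failed with status ${res.status}`);
  }
  return toModelList(await res.json());
}
```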

mjbeverley commented 1 year ago

imoneoi, can you give a code snippet showing how you are doing this?

imoneoi commented 1 year ago

I've created a fork and temporarily added some open-source models; see https://github.com/mckaywrigley/chatbot-ui/compare/main...imoneoi:openchat-ui:main

I think a better solution would be fetching models from the /models API, but there are some difficulties:

  1. The /models API does not return a context limit. Maybe we need to handle the message limit on the server side? I know the OpenAI API can already handle that and return an error message.

ChatInput.tsx

    if (maxLength && value.length > maxLength) {
      alert(
        t(
          `Message limit is {{maxLength}} characters. You have entered {{valueLength}} characters.`,
          { maxLength, valueLength: value.length },
        ),
      );
      return;
    }
  2. It's tedious to add tokenizers for every open-source model. Could we also handle long conversations on the server side? BTW, does the ChatGPT UI truncate conversations on the client side?

Another solution could be to not truncate at all, or to approximate the token count from the word count when the tokenizer is unknown.

chat.ts

    for (let i = messages.length - 1; i >= 0; i--) {
      const message = messages[i];
      const tokens = encoding.encode(message.content);

      if (tokenCount + tokens.length + 1000 > model.tokenLimit) {
        break;
      }
      tokenCount += tokens.length;
      messagesToSend = [message, ...messagesToSend];
    }
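The word-count fallback suggested above could look roughly like this (`approximateTokens`, `countTokens`, and `WORDS_PER_TOKEN` are illustrative names, not from the codebase; the ~0.75 words-per-token ratio is a rough rule of thumb for English text):

```typescript
// Rough token estimate for models whose tokenizer is not bundled.
// Rule of thumb: ~0.75 English words per token, i.e. tokens ≈ words / 0.75.
const WORDS_PER_TOKEN = 0.75;

function approximateTokens(text: string): number {
  const words = text.trim().split(/\s+/).filter((w) => w.length > 0);
  return Math.ceil(words.length / WORDS_PER_TOKEN);
}

// Drop-in replacement for encoding.encode(...).length in the truncation
// loop: use the real tokenizer when available, else the approximation.
function countTokens(
  content: string,
  encoding?: { encode: (s: string) => number[] },
): number {
  return encoding ? encoding.encode(content).length : approximateTokens(content);
}
```

Overestimating slightly (rounding up) errs on the safe side, since sending too many tokens produces a hard API error while trimming a little early only loses some context.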

mjbeverley commented 1 year ago

I've been trying to get it working via https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md, which is an OpenAI-API-compatible endpoint. There is also this: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui. I have tried both with no luck, so I was curious how you did it.
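For reference, the FastChat route can be wired up roughly like this (commands follow FastChat's openai_api.md; the model path is illustrative, and this assumes chatbot-ui reads OPENAI_API_HOST from the environment as its README describes):

```shell
# Start FastChat's OpenAI-compatible stack (per its openai_api.md docs).
python3 -m fastchat.serve.controller &
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.3 &  # illustrative model
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000 &

# Point chatbot-ui at the local endpoint instead of api.openai.com.
export OPENAI_API_HOST=http://localhost:8000
export OPENAI_API_KEY=EMPTY   # FastChat ignores the key, but the UI expects one
npm run dev
```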

Sharpz7 commented 1 year ago

Here is another example that it would be great to get support for: https://github.com/chenhunghan/ialacol/issues/7

For now it fails to read the model list and also fails to set the model, so I am adding both manually. Other than that it seems to work relatively well, but manually stopping generation fails.

It would be great to see something like ialacol and this merged into a single project :))