OvidijusParsiunas / deep-chat

Fully customizable AI chatbot component for your website
https://deepchat.dev
MIT License

Using transformers models in deep chat #124

Closed shizheng-rlfresh closed 4 months ago

shizheng-rlfresh commented 4 months ago

I am wondering if I can use models imported from transformers.js instead of the three existing options. I understand that `requestInterceptor` and `responseInterceptor` act as pre/post-processing hooks, but they require a `connect` or `directConnect`.

Suppose I import a model/pipeline through transformers.js; how would I use it directly in Deep Chat?

OvidijusParsiunas commented 4 months ago

Hi @shizheng-rlfresh. The `request` `handler` property is perfect for this. Once you have added transformers.js to your application, all you need to do is process the pipeline within it.

Example (Vanilla Js):

// assumes transformers.js has been added, e.g.:
// import {pipeline} from '@xenova/transformers';
chatElementRef.request = {
  handler: async (body, signals) => {
    try {
      // load the sentiment-analysis pipeline (downloads the model on first use)
      const pipe = await pipeline('sentiment-analysis');
      const result = await pipe(body.messages[0].text);
      // send the model's output back to the chat UI
      signals.onResponse({text: result[0].label});
    } catch (e) {
      console.error(e);
      signals.onResponse({text: 'Failed to process pipeline'});
    }
  }
};
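One thing to watch: the snippet above creates a new pipeline on every message, so the model is re-initialized each time. A minimal sketch of lazy, one-time loading — note the `getPipeline` helper name is my own, not part of Deep Chat or transformers.js:

```javascript
// Cache the pipeline promise so the model only loads on the first message.
let pipePromise = null;

function getPipeline(factory) {
  // `factory` is whatever creates the pipeline, e.g.
  // () => pipeline('sentiment-analysis') from @xenova/transformers.
  if (!pipePromise) pipePromise = factory();
  return pipePromise;
}
```

Inside the handler you would then call `const pipe = await getPipeline(() => pipeline('sentiment-analysis'));` instead of creating the pipeline directly.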

Let me know if you have any further questions.

shizheng-rlfresh commented 4 months ago

Thank you @OvidijusParsiunas ! Could you please show an example of using the request handler in svelte?

OvidijusParsiunas commented 4 months ago

I also want to mention that I have considered creating a special property for transformers.js, just like `webModel`. However, the current issue is that the configuration to run a model differs by the model, not by the task, so it would be difficult and cumbersome to maintain multiple tasks that each require custom configs for different models. Another problem is that there is currently only one text-generation model that behaves like an LLM (Xenova/Qwen1.5-0.5B-Chat), so creating and maintaining an interface for an ecosystem that does not yet have much to offer in the chatbot domain would not bring much benefit to Deep Chat. Nevertheless, I may revisit this in the future.
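For anyone who wants to experiment with that model through the handler in the meantime, here is a hedged sketch. The `makeHandler` name is made up for this example, and the `pipeline('text-generation', 'Xenova/Qwen1.5-0.5B-Chat')` call plus the `[{generated_text}]` result shape reflect my understanding of the transformers.js API — verify against its docs before relying on it:

```javascript
// Sketch: build a Deep Chat request handler around a transformers.js
// text-generation pipeline. The pipeline loader is injected so the model
// is only fetched when the first message arrives.
function makeHandler(loadPipeline) {
  return async (body, signals) => {
    try {
      const generate = await loadPipeline();
      // use the latest user message as the prompt
      const prompt = body.messages[body.messages.length - 1].text;
      const result = await generate(prompt, {max_new_tokens: 100});
      signals.onResponse({text: result[0].generated_text});
    } catch (e) {
      console.error(e);
      signals.onResponse({text: 'Failed to process pipeline'});
    }
  };
}

// Intended wiring (assumption, untested):
// import {pipeline} from '@xenova/transformers';
// chatElementRef.request = {
//   handler: makeHandler(() =>
//     pipeline('text-generation', 'Xenova/Qwen1.5-0.5B-Chat')),
// };
```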

OvidijusParsiunas commented 4 months ago

Here is some sample code for Svelte:

<script>
  import { DeepChat } from "deep-chat";
  import { pipeline } from "@xenova/transformers";
</script>

<main>
  <deep-chat
    request={{
      handler: async (body, signals) => {
        try {
          const pipe = await pipeline("sentiment-analysis");
          const result = await pipe(body.messages[0].text);
          signals.onResponse({ text: result[0].label });
        } catch (e) {
          console.error(e);
          signals.onResponse({ text: "Failed to process pipeline" });
        }
      },
    }}
  />
</main>

shizheng-rlfresh commented 4 months ago

Thank you @OvidijusParsiunas!!!

OvidijusParsiunas commented 4 months ago

I will be closing this issue since the topic of discussion has been resolved. Nevertheless feel free to comment below or create a new issue for anything else. Thanks!