Yonom / assistant-ui

React Components for AI Chat 💬 🚀
https://www.assistant-ui.com
MIT License
391 stars 26 forks

Integrate self-hosted CHAT_API #576

Closed RoadToDev101 closed 1 month ago

RoadToDev101 commented 1 month ago

Hi team,

I have an API that streams responses. Can I use the UI to integrate with it? Sorry for my bad English

Here is an example of calling the chat API in my Next.js app.

import { useChat } from 'ai/react'

export default function ChatBox() {
  const {
    messages,
    input,
    handleInputChange,
    handleSubmit,
    isLoading,
    stop,
    reload,
    error,
  } = useChat({
    api: process.env.NEXT_PUBLIC_CHAT_API,
    headers: {
      'Content-Type': 'application/json',
    },
    streamMode: 'text',
    onFinish(message) {
      console.log(message.content)
    },
  })

  ...
Yonom commented 1 month ago

Of course! We have first-class support for the Vercel AI SDK!

npm i @assistant-ui/react-ai-sdk

import { useChat } from 'ai/react'
import { useVercelUseChatRuntime } from "@assistant-ui/react-ai-sdk";
import { Thread } from "@assistant-ui/react";

export default function ChatBox() {
  const chat = useChat({
    api: process.env.NEXT_PUBLIC_CHAT_API,
    headers: {
      'Content-Type': 'application/json',
    },
    streamMode: 'text',
    onFinish(message) {
      console.log(message.content)
    },
  })

  const {
    messages,
    input,
    handleInputChange,
    handleSubmit,
    isLoading,
    stop,
    reload,
    error,
  } = chat;

  const runtime = useVercelUseChatRuntime(chat);

  return <Thread runtime={runtime} />; // or wrap your app in <AssistantRuntimeProvider runtime={runtime}>
  ...
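Putting the pieces together, a minimal sketch of the full component might look like this (the API URL, headers, and streamMode are carried over from your snippet; adjust them for your backend):

```tsx
"use client";

import { useChat } from "ai/react";
import { useVercelUseChatRuntime } from "@assistant-ui/react-ai-sdk";
import { Thread } from "@assistant-ui/react";

export default function ChatBox() {
  // Same useChat config as in your snippet above
  const chat = useChat({
    api: process.env.NEXT_PUBLIC_CHAT_API,
    headers: { "Content-Type": "application/json" },
    streamMode: "text",
  });

  // Bridge the AI SDK chat state into an assistant-ui runtime
  const runtime = useVercelUseChatRuntime(chat);

  return <Thread runtime={runtime} />;
}
```

Note that you no longer need to destructure messages, input, handleSubmit, etc. yourself; the runtime wires all of that into Thread for you.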

Please let me know if you have any more questions - otherwise I will be closing this issue within 24 hours. Feel free to join our Discord too :)

RoadToDev101 commented 1 month ago

Thank you for the response @Yonom. I can't try it right now, but I will try it on Monday (GMT+7) 😁. Anyway, great work, team!

RoadToDev101 commented 1 month ago

Hi @Yonom, I tried it and it was amazing! It looks nearly the same as my UI, but better. It would be amazing if you could implement the loading spinner, stop, and resend message features.

Yonom commented 1 month ago

@RoadToDev101 all the features you mentioned should work. Can you tell me if you're importing Thread from @assistant-ui/react or from @/components/ui/assistant-ui/thread?

In case it's from @assistant-ui/react, this is a bug - please let me know!

In case it's from @/components/ui/assistant-ui/thread; you can add all the missing functionality by running:

npx assistant-ui add thread-full

In case that doesn't work, this is a bug - please let me know!

RoadToDev101 commented 1 month ago

@Yonom I am using @/components/ui/assistant-ui/thread. Thanks man, I just did what you instructed and I saw the full features!

RoadToDev101 commented 1 month ago

But there is a bug here: I think it is splitting the response word by word. @Yonom

image

image

RoadToDev101 commented 1 month ago

@Yonom I also ran into an issue where the response overflows the input section

image

Yonom commented 1 month ago

uh oh! I'll look into this

Yonom commented 1 month ago

@RoadToDev101 Overflow input transparency: please add a bg-inherit class to Thread.Viewport. Example diff: https://github.com/Yonom/assistant-ui/commit/833d89b564f86f1ca3882ff9cf2eaded0c57c3b8
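For reference, the fix is a one-class change; a sketch (the surrounding classes here are illustrative, not the exact ones from the commit):

```tsx
<Thread.Viewport className="bg-inherit flex flex-col overflow-y-scroll">
  {/* ... messages ... */}
</Thread.Viewport>
```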

Yonom commented 1 month ago

@RoadToDev101 the reason every token is being split into its own message is that the message ID the AI SDK provides changes on each update

This only happens when streamMode: "text" is set...

Screenshot 2024-07-31 at 14 52 15

I'll file a bug for the AI SDK...
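The effect can be illustrated with a simplified model of how streamed updates are grouped into messages by ID (the grouping logic below is an assumption for illustration, not assistant-ui's actual code):

```typescript
type StreamUpdate = { id: string; text: string };

// Group updates by the message ID they carry: later updates with the
// same ID overwrite earlier ones, so a stable ID yields one message.
function groupIntoMessages(updates: StreamUpdate[]): Map<string, string> {
  const messages = new Map<string, string>();
  for (const u of updates) messages.set(u.id, u.text);
  return messages;
}

// Stable ID (expected behavior): one message that grows as tokens arrive.
const stable = groupIntoMessages([
  { id: "msg-1", text: "Hel" },
  { id: "msg-1", text: "Hello" },
]);

// ID changes on every update (the streamMode: "text" bug):
// each token ends up as its own message.
const broken = groupIntoMessages([
  { id: "msg-1", text: "Hel" },
  { id: "msg-2", text: "lo" },
]);
```

With a stable ID, `stable.size` is 1; with changing IDs, `broken.size` is 2, which matches the word-by-word splitting in the screenshots above.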

Yonom commented 1 month ago

@RoadToDev101

In the meantime, you can work around it with a custom ChatModelAdapter and useLocalRuntime:

"use client";

import { ChatModelAdapter, useLocalRuntime } from "@assistant-ui/react";
import { Thread } from "@assistant-ui/react-ui";

export function asAsyncIterable<T>(
  source: ReadableStream<T>
): AsyncIterable<T> {
  return {
    [Symbol.asyncIterator]: () => {
      const reader = source.getReader();
      return {
        async next(): Promise<IteratorResult<T, undefined>> {
          const { done, value } = await reader.read();
          return done
            ? { done: true, value: undefined }
            : { done: false, value };
        },
      };
    },
  };
}
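As a sanity check, the helper can be exercised on its own with an in-memory ReadableStream (Node 18+ provides ReadableStream globally; streamOf and collectText are hypothetical helpers for this demo):

```typescript
function asAsyncIterable<T>(source: ReadableStream<T>): AsyncIterable<T> {
  return {
    [Symbol.asyncIterator]: () => {
      const reader = source.getReader();
      return {
        async next(): Promise<IteratorResult<T, undefined>> {
          const { done, value } = await reader.read();
          return done
            ? { done: true, value: undefined }
            : { done: false, value };
        },
      };
    },
  };
}

// Build a stream that emits a few text chunks, then closes.
function streamOf(chunks: string[]): ReadableStream<string> {
  return new ReadableStream<string>({
    start(controller) {
      for (const c of chunks) controller.enqueue(c);
      controller.close();
    },
  });
}

// Drain the stream with for-await, exactly as the adapter below
// drains the decoded fetch body.
async function collectText(chunks: string[]): Promise<string> {
  let text = "";
  for await (const chunk of asAsyncIterable(streamOf(chunks))) {
    text += chunk;
  }
  return text;
}
```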

const MyCustomAdapter: ChatModelAdapter = {
  async *run({ messages, abortSignal }) {
    const messagesToSend = messages.map((m) => ({
      role: m.role,
      content: m.content
        .filter((c) => c.type === "text")
        .map((c) => c.text)
        .join(" "),
    }));

    const response = await fetch("/api/chat", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        messages: messagesToSend,
      }),
      signal: abortSignal,
    });

    let message = {
      content: [
        {
          type: "text" as const,
          text: "",
        },
      ],
    };

    for await (const chunk of asAsyncIterable(
      response.body!.pipeThrough(new TextDecoderStream())
    )) {
      message = {
        content: [{ type: "text", text: message.content[0].text + chunk }],
      };
      yield message;
    }
  },
};

export default function Home() {
  const runtime = useLocalRuntime(MyCustomAdapter);
  ...
}
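The elided part of Home is just the render; following the same Thread-with-runtime pattern shown earlier in this thread, a minimal sketch:

```tsx
export default function Home() {
  const runtime = useLocalRuntime(MyCustomAdapter);
  return <Thread runtime={runtime} />;
}
```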