OvidijusParsiunas / deep-chat

Fully customizable AI chatbot component for your website
https://deepchat.dev
MIT License
1.26k stars 170 forks

Feature Request: Include Azure OpenAI support for OpenAI service #144

Closed rodneyviana closed 3 months ago

rodneyviana commented 3 months ago

Thank you for this component. It is amazing, the best in its category. I don't want to sound ungrateful, but I have a feature request. I am using handlers to enable Azure OpenAI, simply because Azure OpenAI's endpoints differ from OpenAI's. It works, but it took a lot of effort to understand the component's inner workings and make sense of the documentation. I'm still trying to make streaming work, but I think I'm close.

I was checking the source code of the services, and this change would mostly be a matter of letting you choose a different endpoint (and API version) via component props. The rest of the code could stay the same.

For instance, the endpoint URL could become a property rather than being hardcoded:

// Make the line below a configurable parameter to allow Azure OpenAI
url = 'https://api.openai.com/v1/chat/completions';
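To illustrate why a configurable endpoint matters: Azure OpenAI routes requests per deployment and requires an `api-version` query parameter, unlike OpenAI's fixed `/v1/chat/completions` path. A minimal sketch of a helper building such a URL (the function name and default API version here are assumptions, not part of Deep Chat):

```typescript
// Hypothetical helper showing how the Azure OpenAI chat endpoint differs
// from the hardcoded OpenAI one. Resource and deployment names are
// placeholders for your own Azure OpenAI resource.
function buildAzureChatEndpoint(
  resource: string,
  deployment: string,
  apiVersion: string = "2023-05-15"
): string {
  // Azure addresses a specific deployment and versions the API via a
  // query parameter rather than a path segment.
  return `https://${resource}.openai.azure.com/openai/deployments/` +
    `${deployment}/chat/completions?api-version=${apiVersion}`;
}
```

With a prop for the URL, the rest of the OpenAI service code (headers aside, Azure uses `api-key` instead of `Authorization`) could stay unchanged.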
shahbazsyed commented 3 months ago

@rodneyviana I am facing the same hurdle at the moment. Would it be possible for you to share the handler code you used to make it work with Azure OpenAI? Thanks!

And yes, this would be an awesome feature to increase its adoption also in enterprise projects. Plus one to this feature!

rodneyviana commented 3 months ago

@shahbazsyed,

This is my implementation:

        <DeepChat
          style={{
            borderRadius: "10px",
            borderColor: "#e4e4e4",
            background:
              "linear-gradient(90deg, rgb(239, 242, 247) 0%, 7.60286%, rgb(237, 240, 249) 15.2057%, 20.7513%, rgb(235, 239, 248) 26.297%, 27.6386%, rgb(235, 239, 248) 28.9803%, 38.2826%, rgb(231, 237, 249) 47.585%, 48.1216%, rgb(230, 236, 250) 48.6583%, 53.1306%, rgb(228, 236, 249) 57.6029%, 61.5385%, rgb(227, 234, 250) 65.4741%, 68.7835%, rgb(222, 234, 250) 72.093%, 75.7603%, rgb(219, 230, 248) 79.4275%, 82.8265%, rgb(216, 229, 248) 86.2254%, 87.8354%, rgb(213, 228, 249) 89.4454%, 91.8605%, rgb(210, 226, 249) 94.2755%, 95.4383%, rgb(209, 225, 248) 96.6011%, 98.3005%, rgb(208, 224, 247) 100%)",
          }}
          chatStyle={{ width: "100%", minWidth: "600px", height: "800px" }}
          messageStyles={{
            default: {
              ai: {
                bubble: {
                  maxWidth: "100%",
                },
              },
            },
          }}
          request={{
            url: getChatEndpoint(),
            method: "POST",
            headers: {
              "api-key": getChatKey(),
              "Content-Type": "application/json",
            },
            handler: async (body, signals) => {
              try {
                body.messages.forEach((value: { role: string, text: string}) => {
                  messages.push({ role: value.role, content: value.text });
                });
                initialMessages.push({
                  role: body.messages[0].role,
                  text: body.messages[0].text,
                });
                const fullBody = {
                  max_tokens: 4000,
                  temperature: 0.6,
                  top_p: 1,
                  stop: null,
                  messages
                };
                const response = await fetch(
                  getChatEndpoint(),
                  {
                    method: "POST",
                    headers: {
                      "api-key": getChatKey(),
                      "Content-Type": "application/json",
                    },
                    body: JSON.stringify(fullBody),
                  }
                );
                const json = await response.json();
                if (json.error) {
                  signals.onResponse({ error: json.error.message });
                  return;
                }
                // displays the response

                const JSONResponse = json.choices[0];
                if (JSONResponse.finish_reason !== "stop") {
                  signals.onResponse({
                    error: `Error ${JSONResponse.finish_reason}`,
                  }); // displays an error message
                  return;
                }
                messages.push({
                  role: JSONResponse.message.role,
                  content: JSONResponse.message.content,
                });
                this.setState({ messages });
                initialMessages.push({
                  role: JSONResponse.message.role,
                  text: JSONResponse.message.content,
                });
                signals.onResponse({
                  text: JSONResponse.message.content,
                  role: 'ai' // JSONResponse.message.role,
                }); // displays the response
              } catch (e) {
                signals.onResponse({ error: "Error" }); // displays an error message
              }
            },
          }}
          textInput={{ placeholder: { text: "Ask me questions about the document..." } }}
          initialMessages={initialMessages}
        />
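On the streaming part mentioned above: Azure OpenAI streams chat completions as server-sent events, one `data:` line per chunk with a final `data: [DONE]` sentinel. A hedged sketch of extracting the delta text from such a chunk (payload shapes assumed from the OpenAI-style streaming format; not code from this project):

```typescript
// Parses one SSE chunk from an OpenAI-style streaming chat completion and
// returns the pieces of delta text it contains. Shapes are assumed from
// the OpenAI API; adjust if your Azure api-version differs.
function extractStreamDeltas(chunk: string): string[] {
  const deltas: string[] = [];
  for (const line of chunk.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue; // skip blank/comment lines
    const payload = trimmed.slice(5).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    try {
      const json = JSON.parse(payload);
      const content = json.choices?.[0]?.delta?.content;
      if (typeof content === "string") deltas.push(content);
    } catch {
      // Partial JSON split across chunk boundaries; a real client
      // would buffer the remainder until the next chunk arrives.
    }
  }
  return deltas;
}
```

Each extracted delta could then be forwarded to the chat as it arrives, rather than waiting for the full response.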
shahbazsyed commented 3 months ago

@rodneyviana Thanks a lot!

OvidijusParsiunas commented 3 months ago

Hi @rodneyviana.

If you want to use the directConnection services with a custom url, you can do so by defining both your directConnection property and the request property with a url.

E.g. in this example the custom OpenAI configuration will use the 'custom-url' for its communication:

<DeepChat
  request={{
    url: 'custom-url',
  }}
  directConnection={{
    openAI: {
      chat: true,
      key: 'mock-key',
    },
  }}
/>

It is important to note that this will not work for all services, specifically those that need to call multiple URLs to complete a request, such as the OpenAI Assistant, as providing an API to override every URL would be too cumbersome.

Hopefully this helps with your request. Let me know if I have misunderstood anything so we can make sure everything is clear.

I also noticed that in your example code, as well as defining the handler, you define all the other details inside the request property. Usually the handler is all you need; the other properties will not really affect it unless you set stream or websocket to true.

OvidijusParsiunas commented 3 months ago

A note on using the handler function. I can definitely sympathize with the fact that it is not easy to implement. Its core purpose is to allow developers to define their own custom request logic rather than relying only on what we provide out of the box. Of course, with high customization comes high complexity, so it is a tradeoff we have to make.

When defining custom implementations that are very similar to directConnection services, we encourage developers to look at the code we use here to help them get started. Thank you very much for providing your handler example!

Another alternative we offer is the request property along with request and response interceptors, which should help developers avoid the handler as much as possible.
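A minimal sketch of that interceptor route: a pure function that maps an OpenAI-style chat completion payload to the `{text}` / `{error}` shape the component displays. The type names here are illustrative assumptions, not definitions from the library:

```typescript
// Illustrative types for an OpenAI/Azure OpenAI chat completion response.
type ChatChoice = {
  finish_reason: string;
  message: { role: string; content: string };
};
type ChatResponse = { error?: { message: string }; choices: ChatChoice[] };

// Maps the raw API payload to a simple display shape, mirroring the
// error handling in the handler example earlier in this thread.
function toDeepChatResponse(
  details: ChatResponse
): { text: string } | { error: string } {
  if (details.error) return { error: details.error.message };
  const choice = details.choices[0];
  if (choice.finish_reason !== "stop") {
    return { error: `Error ${choice.finish_reason}` };
  }
  return { text: choice.message.content };
}
```

A response interceptor wired to such a function keeps the request plumbing in Deep Chat while only the payload translation is custom.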

Of course, the usual long-term solution for a production chat is to have the requests go through your own server for security purposes. That is why we offer pre-made server solutions to help developers get started quicker here.

Your feedback is very important and we always take it into consideration when improving Deep Chat! Thanks again!

rodneyviana commented 3 months ago

Thank you. Very important pointers. I will try directConnection first and will let you know. Also, the project I am writing, a chatbot for document analysis in SharePoint Online, will be open source as soon as I get it right, and I will add the link here. Feel free to use the implementation as an example in your documentation to help other users.

OvidijusParsiunas commented 3 months ago

Thank you @rodneyviana!

rodneyviana commented 3 months ago

I published the first version of the SharePoint command extender before starting the experiments. The project is here: spfx-azure-ai-chat

Next step is to try directConnection and get it working in stream mode.

rodneyviana commented 3 months ago

@OvidijusParsiunas, it worked like a charm.

OvidijusParsiunas commented 3 months ago

I'm happy to hear it works!! I just checked out your awesome project and it looks amazing!! :rocket: :rocket: :rocket: :rocket: