vercel / ai-chatbot

A full-featured, hackable Next.js AI chatbot built by Vercel
https://chat.vercel.ai

Add Support for Azure OpenAI alongside existing OpenAI platform #162

Closed ArunkumarRamanan closed 7 months ago

ArunkumarRamanan commented 8 months ago

Expected Enhancement

In its current setup, the AI Chatbot is only compatible with the OpenAI platform. This limits users and organizations that prefer or need to use Azure Cognitive Services, including Azure OpenAI. To increase the adoption and usability of the app across diverse ecosystems, extending support to the Azure OpenAI Service would be pivotal.

Why is this Important?

  1. Increasing Compatibility: By supporting Azure OpenAI, you can ensure that organizations entrenched in the Azure ecosystem can seamlessly integrate your solutions into their workflows.

  2. Leveraging Azure's features: Azure provides robust security, compliance, and scalability. Your solution can harness these features if you extend support to include Azure.

  3. Expanding User Base: Organizations who are keen on using Azure services might not be open to switching to other platforms for certain services. Providing compatibility with Azure OpenAI will increase your potential user base.

Proposed Solution

This could be achieved by integrating the Azure OpenAI SDK into the codebase to make API calls and handle responses. The integration should let users switch between OpenAI and Azure OpenAI as effortlessly as possible, which may involve extending the current configuration to handle more complex scenarios such as managing different tokens/credentials (a rough sketch of such a provider switch is included after the roadmap below).

A potential roadmap could be:

  1. Research Azure OpenAI service and SDK and understand its workflow and setup
  2. Update your system's authentication service to handle Azure's credentials
  3. Update the services making the OpenAI API calls to also support making Azure API calls
  4. Ensure the responses from Azure OpenAI can be correctly interpreted by your system
  5. Run thorough testing to ensure robustness and efficiency
  6. Update system documentation and user guide to include instructions about setting up and using Azure OpenAI
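
As a rough sketch of steps 2–3, the route handler could construct either client depending on which environment variables are present. Nothing below exists in the repo today: the AZURE_OPENAI_* variable names are assumptions for illustration, and the Azure-specific options simply use what the OpenAI JS v4 client already accepts for a custom endpoint.

```ts
// Sketch: pick OpenAI or Azure OpenAI at startup based on env config.
// All AZURE_OPENAI_* names are hypothetical; rename to match your own setup.
import OpenAI from 'openai'

const useAzure = Boolean(process.env.AZURE_OPENAI_KEY)

export const openai = useAzure
  ? new OpenAI({
      apiKey: process.env.AZURE_OPENAI_KEY,
      // e.g. https://{resource}.openai.azure.com/openai/deployments/{deployment}/
      baseURL: process.env.AZURE_OPENAI_ENDPOINT,
      // Azure expects an api-version query param and an `api-key` header
      // instead of the usual Bearer token.
      defaultQuery: { 'api-version': process.env.AZURE_OPENAI_API_VERSION },
      defaultHeaders: { 'api-key': process.env.AZURE_OPENAI_KEY },
    })
  : new OpenAI({ apiKey: process.env.OPENAI_API_KEY })
```

The rest of the route handler could stay the same, since both clients expose the same chat.completions interface.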

Any help from the community toward implementing this feature would be appreciated.

Thank you.

coozywana commented 8 months ago

@ArunkumarRamanan Hi, so it seems like this project doesn't support any LLMs other than OpenAI, am I correct?

leerob commented 7 months ago

You can add any LLM that's supported through: https://sdk.vercel.ai/docs

coozywana commented 7 months ago

@leerob Hey, thank you for your response. The route.ts from the SDK's LangChain guide (https://sdk.vercel.ai/docs/guides/providers/langchain) doesn't seem to match the one in this repo. You can get the response and everything else working, but the way it connects to the KV store doesn't match, and the memory formatter also seems to have errors. See https://github.com/vercel/ai-chatbot/issues/103.
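
For what it's worth, one way to keep KV persistence when following the LangChain guide is to pass an onCompletion callback to LangChainStream, similar to what the repo does with OpenAIStream. This is only a sketch under a few assumptions: the langchain import paths reflect the package layout from around the time of this thread, and the payload written to KV is a simplified stand-in for the repo's actual chat record.

```ts
import { kv } from '@vercel/kv'
import { LangChainStream, StreamingTextResponse, type Message } from 'ai'
import { ChatOpenAI } from 'langchain/chat_models/openai'
import { AIMessage, HumanMessage } from 'langchain/schema'

export async function POST(req: Request) {
  const { messages, id } = await req.json()

  const { stream, handlers } = LangChainStream({
    // Fires once the full completion has streamed; persist the chat here.
    onCompletion: async (completion) => {
      await kv.hmset(`chat:${id}`, {
        id,
        messages: [...messages, { role: 'assistant', content: completion }],
      })
    },
  })

  const llm = new ChatOpenAI({ streaming: true })

  // Fire-and-forget: the handlers pipe tokens into `stream` as they arrive.
  llm
    .call(
      (messages as Message[]).map((m) =>
        m.role === 'user' ? new HumanMessage(m.content) : new AIMessage(m.content)
      ),
      {},
      [handlers]
    )
    .catch(console.error)

  return new StreamingTextResponse(stream)
}
```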

AvinaashAnandK commented 5 months ago

@ArunkumarRamanan - The OpenAI JS package supports accessing OpenAI models deployed on Azure OpenAI. I just changed the object instantiation in api/chat/route.ts (shown below) and tested it out in Postman by commenting out the auth parts of the file.

const openai = new OpenAI({
  apiKey: process.env.AZURE_KEY,
  baseURL: process.env.FULL_ENDPOINT,
  defaultQuery: { 'api-version': process.env.API_VERSION },
  defaultHeaders: { 'api-key': process.env.AZURE_KEY },
});

Where API_VERSION would be something like "2023-07-01-preview", and FULL_ENDPOINT would be structured as "https://{resource_name}.openai.azure.com/openai/deployments/{deployed_model_name}/".
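
For completeness, here is a minimal sketch of how a client configured that way could plug into the existing streaming flow, assuming the openai v4 package and the AI SDK's OpenAIStream/StreamingTextResponse helpers. The model value is a placeholder: with Azure, the deployment named in FULL_ENDPOINT determines which model actually serves the request.

```ts
import OpenAI from 'openai'
import { OpenAIStream, StreamingTextResponse } from 'ai'

// Same instantiation as above, reading the Azure settings from the environment.
const openai = new OpenAI({
  apiKey: process.env.AZURE_KEY,
  baseURL: process.env.FULL_ENDPOINT,
  defaultQuery: { 'api-version': process.env.API_VERSION },
  defaultHeaders: { 'api-key': process.env.AZURE_KEY },
})

export async function POST(req: Request) {
  const { messages } = await req.json()

  // Stream the completion; Azure routes to the deployment in FULL_ENDPOINT,
  // so the model name here is effectively a placeholder.
  const res = await openai.chat.completions.create({
    model: 'gpt-35-turbo',
    messages,
    stream: true,
  })

  return new StreamingTextResponse(OpenAIStream(res))
}
```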