run-llama / chat-llamaindex

https://chat.llamaindex.ai
MIT License
849 stars · 261 forks

Supported LLM: Azure OpenAI? #52

Open jagadishdoki opened 9 months ago

jagadishdoki commented 9 months ago

You have indicated that the ChatGPT-Next-Web project was used as a starter template for this project. Can you please confirm whether LlamaIndex Chat supports Azure OpenAI?

If yes, please provide instructions for switching to Azure OpenAI. If not, will this be treated as a feature enhancement? Is there a quick way to make the switch to Azure OpenAI?

Content of the `.env.development.local` file:

```
# Your OpenAI API key (required)
OPENAI_API_KEY=sk-xxxx
```

marcusschiesser commented 9 months ago

Chat LlamaIndex can use any LLM that is supported by LlamaIndexTS, you just have to plug it in here: https://github.com/run-llama/chat-llamaindex/blob/86aca07c475414abe2f3c46964294e654d7fc78b/app/api/llm/route.ts#L149-L154
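For orientation, here is a minimal sketch of what such an Azure options block might look like. The field names (`apiKey`, `endpoint`, `apiVersion`, `deploymentName`) and the `AZURE_OPENAI_KEY` environment variable are assumptions for illustration, not a confirmed LlamaIndexTS API:

```typescript
// Sketch only: field names are assumptions, not a confirmed LlamaIndexTS API.
type AzureOptions = {
  apiKey: string;         // Azure OpenAI key from the Azure portal
  endpoint: string;       // base resource URL, without the deployments path
  apiVersion: string;     // e.g. "2023-07-01-preview"
  deploymentName: string; // the Azure deployment, e.g. "gpt-35-turbo"
};

const azureOptions: AzureOptions = {
  apiKey: process.env.AZURE_OPENAI_KEY ?? "",
  endpoint: "https://my-resource.openai.azure.com",
  apiVersion: "2023-07-01-preview",
  deploymentName: "gpt-35-turbo",
};
```

An object shaped like this would then be passed where `route.ts` constructs its LLM instance.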

destinychanger commented 9 months ago

> Chat LlamaIndex can use any LLM that is supported by LlamaIndexTS, you just have to plug it in here:
>
> https://github.com/run-llama/chat-llamaindex/blob/86aca07c475414abe2f3c46964294e654d7fc78b/app/api/llm/route.ts#L149-L154

Hi, below is the config I added, as suggested:

```ts
azure: {
  apiKey: "xxxxxxxxx",
  endpoint: "https://azureopenaillamaindex.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-07-01-preview",
  apiVersion: "2023-07-01-preview",
  deploymentName: "gpt-35-turbo",
}
```

And I am getting the error below:

```
[LlamaIndex] BadRequestError: 400 Unsupported data type
    at APIError.generate (C:\LAAMA\chat-llamaindex\node_modules\openai\error.js:43:20)
    at AzureOpenAI.makeStatusError (C:\LAAMA\chat-llamaindex\node_modules\openai\core.js:252:33)
    at AzureOpenAI.makeRequest (C:\LAAMA\chat-llamaindex\node_modules\openai\core.js:293:30)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
  status: 400,
  headers: {
    'apim-request-id': 'df10be42-1577-481f-b65e-d1abb4f43bc9',
    'content-length': '22',
    'content-type': 'text/plain; charset=utf-8',
    date: 'Tue, 19 Dec 2023 07:14:23 GMT',
    'ms-azureml-model-error-reason': 'model_error',
    'ms-azureml-model-error-statuscode': '400',
    'strict-transport-security': 'max-age=31536000; includeSubDomains; preload',
    'x-content-type-options': 'nosniff',
    'x-ms-client-request-id': 'df10be42-1577-481f-b65e-d1abb4f43bc9',
    'x-ms-region': 'East US',
    'x-ratelimit-remaining-requests': '119',
    'x-ratelimit-remaining-tokens': '119984',
    'x-request-id': '23112d7f-5f0f-4d28-8673-a45c99c69cd3'
  },
  error: undefined,
  code: undefined,
  param: undefined,
  type: undefined
}
```

And in the UI it shows:

(screenshot: "llama error")

Can you please help me figure out what the issue could be?
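One possible culprit (an assumption on my part, not something confirmed in this thread): the `endpoint` value in the config above is the full chat-completions URL, while Azure OpenAI clients typically expect only the base resource URL and assemble the deployment path and `api-version` themselves. Passing the full URL can make the client append the path a second time. Sketched in TypeScript (function names are made up for this illustration):

```typescript
// Illustrative only: shows how an Azure OpenAI client typically builds the
// request URL from its parts; the function names are made up for this sketch.
function baseEndpoint(resource: string): string {
  return `https://${resource}.openai.azure.com`;
}

function chatCompletionsUrl(
  resource: string,
  deployment: string,
  apiVersion: string
): string {
  return `${baseEndpoint(resource)}/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`;
}

// If chatCompletionsUrl(...) is passed as `endpoint`, the client appends the
// deployments path again, producing a malformed request that can surface as a 400.
const endpoint = baseEndpoint("azureopenaillamaindex");
```

If this guess is right, setting `endpoint` to just the base resource URL (and leaving `deploymentName`/`apiVersion` as separate fields) would be the fix to try first.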

marcusschiesser commented 9 months ago

Sorry, we don't currently have an Azure example. I would start with this example https://github.com/run-llama/LlamaIndexTS/blob/main/examples/openai.ts and modify the parameters.