FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0

[BUG] Using Azure OpenAI LLM produces error in result parsing #938

Open sebbae opened 1 year ago

sebbae commented 1 year ago

Describe the bug

Using Azure OpenAI in simple chat scenarios produces a code-level error, which surfaces in the chat as TypeError: Cannot read properties of undefined (reading '0').

While the chat box is still open, the actual result is displayed afterwards as a (separate?) message. Note that the latter often looks rather odd as well, but this may be unrelated to this error (see screenshots).

After closing and reopening the chat window, only the error message is visible.

TypeError: Cannot read properties of undefined (reading '0')
    at NoOpOutputParser.parseResult (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:24:38)
    at NoOpOutputParser.parseResultWithPrompt (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/schema/output_parser.cjs:7:21)
    at LLMChain._getFinalOutput (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:93:55)
    at LLMChain._call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/llm_chain.cjs:123:42)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async LLMChain.call (/usr/local/lib/node_modules/flowise/node_modules/langchain/dist/chains/base.cjs:98:28)
    at async runPrediction (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:128:29)
    at async LLMChain_Chains.run (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/chains/LLMChain/LLMChain.js:77:21)
    at async App.processPrediction (/usr/local/lib/node_modules/flowise/dist/index.js:783:19)
    at async /usr/local/lib/node_modules/flowise/dist/index.js:558:13
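
For context: the throwing frame is LangChain's default (pass-through) output parser, which indexes into the model's generations without any guard. A minimal sketch of that code path in TypeScript (paraphrased for illustration, not the exact langchain source):

```typescript
// Paraphrased sketch of the failing path in langchain's schema/output_parser.
// `Generation` mirrors the shape LangChain builds from the LLM response.
interface Generation {
  text: string;
}

class NoOpOutputParser {
  // Assumes at least one generation came back from the model.
  parseResult(generations: Generation[]): string {
    // If the Azure call yields no generations (the array is undefined,
    // e.g. because the response could not be mapped to the expected
    // shape), `generations[0]` throws exactly the TypeError above.
    return generations[0].text;
  }
}
```

So the parser itself is fine for a healthy response; the error suggests the Azure call returned something Flowise/LangChain could not turn into a generation.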

To Reproduce

Steps to reproduce the behavior:

  1. Create a new chat flow from template "Simple LLM Chain"
  2. Replace OpenAI LLM with Azure OpenAI
  3. Configure access credentials for LLM as per docs
  4. Save and deploy
  5. See error
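
To narrow this down, it may help to call the same LangChain Azure wrapper outside of Flowise. A minimal sketch, assuming the langchain package that Flowise bundles; the instance and deployment names below are placeholders:

```typescript
// Standalone test of the Azure OpenAI chat model, bypassing the Flowise UI.
import { ChatOpenAI } from "langchain/chat_models/openai";
import { HumanMessage } from "langchain/schema";

const model = new ChatOpenAI({
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: "my-instance",    // placeholder: your resource name
  azureOpenAIApiDeploymentName: "gpt-35-turbo", // the *deployment* name, not the base model name
  azureOpenAIApiVersion: "2023-05-15",          // must be a version your resource supports
});

const res = await model.call([new HumanMessage("Say hello")]);
console.log(res.content);
```

If this snippet fails as well, the problem lies in the Azure configuration rather than in Flowise itself.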

Expected behavior

A single, clear reply from the LLM in the chat box.

Screenshots

The Azure model is configured as follows (the API instance name is a dummy here):

Screenshot of API access credentials

The simple chat scenario with the error in the result looks as follows:

Screenshot of weird response


HenryHengZJ commented 1 year ago

Which model did you deploy on Azure? Older, deprecated models might not work. Here are the docs: https://docs.flowiseai.com/chat-models/azure-chatopenai

sebbae commented 1 year ago

I tried deployments of both 'gpt-35-turbo' and 'gpt-4'.
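
To rule out the deployments themselves, the Azure REST endpoint can also be called directly; a healthy deployment returns a non-empty `choices` array, and an empty or missing one would explain the undefined generations in the parser. A hedged sketch (instance, deployment, and API version are placeholders):

```typescript
// Direct call to the Azure OpenAI chat completions endpoint, bypassing
// Flowise and LangChain entirely. Requires Node 18+ for global fetch.
const endpoint =
  "https://my-instance.openai.azure.com/openai/deployments/gpt-35-turbo" +
  "/chat/completions?api-version=2023-05-15";

const response = await fetch(endpoint, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "api-key": process.env.AZURE_OPENAI_API_KEY ?? "",
  },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Say hello" }],
  }),
});

const data = await response.json();
console.log(JSON.stringify(data, null, 2));
```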

emil2intra commented 1 year ago

Still doesn't work.

emil2intra commented 1 year ago

Screenshot of the error

i-LUDUS commented 6 months ago

Screenshot of setup (2024-05-12)

Does anyone have a working setup for using Azure OpenAI in Flowise?