Closed: vinamrajais closed this issue 3 months ago
Maybe a proxy issue, or was the OpenAI server down?
@HenryHengZJ The OpenAI server is fine; I can execute queries directly and from other sites that use the OpenAI APIs. I am not running any proxy or VPN.
Is there a way to check whether anything is wrong with my Flowise setup?
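(Note for anyone else debugging this: one way to rule Flowise out is to call the OpenAI API directly with the same key. A rough sketch, assuming `curl` is installed and the key is exported as `$OPENAI_API_KEY`:)

```shell
# Send a minimal chat request straight to the OpenAI API, bypassing Flowise.
# A 429 response whose body contains "insufficient_quota" means the key has
# no usable quota; a 401 means the key itself is invalid.
curl -sS https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hello"}]}'
```

If this direct call succeeds but Flowise still fails, the problem is in the Flowise setup; if it fails the same way, the problem is with the key or account.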
It's working now. The issue was with my OpenAI API access quota; I had to buy a paid plan to be able to use the APIs.
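(Follow-up for future readers: the generic "Connection error" that Flowise surfaces can hide the real OpenAI status code. A small illustrative helper for interpreting the raw API error response, based on OpenAI's documented status codes; the function name and messages here are my own, not part of Flowise or the OpenAI SDK:)

```python
import json

def classify_openai_error(status_code: int, body: str) -> str:
    """Map a raw OpenAI API error response to a likely cause.

    Illustrative sketch only; the categories follow OpenAI's documented
    HTTP status codes (401 = auth failure, 429 = rate limit or quota).
    """
    try:
        code = json.loads(body).get("error", {}).get("code", "")
    except (json.JSONDecodeError, AttributeError):
        code = ""
    if status_code == 401:
        return "invalid or missing API key"
    if status_code == 429 and code == "insufficient_quota":
        return "quota exhausted: add billing credits or buy a plan"
    if status_code == 429:
        return "rate limited: slow down and retry"
    return f"unhandled error (HTTP {status_code})"

# The response shape that corresponds to this issue's root cause:
print(classify_openai_error(429, '{"error": {"code": "insufficient_quota"}}'))
# prints "quota exhausted: add billing credits or buy a plan"
```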
Describe the bug
The Flowise OpenAI node is throwing a "Connection error" with a plain "hello" prompt.
To Reproduce
Steps to reproduce the behavior:
Created a simple Flowise app as below: ![image](https://github.com/FlowiseAI/Flowise/assets/4811678/ae3a7071-3dd2-461a-84cf-87d338b5ca16)
See error
2024-03-18 12:34:20 [INFO]: ⬆️ POST /api/v1/internal-prediction/b60182b2-f13b-45ce-a8cf-1f6692fbfa03
2024-03-18 12:34:20 [INFO]: [server]: Chatflow b60182b2-f13b-45ce-a8cf-1f6692fbfa03 added into ChatflowPool
2024-03-18 12:36:10 [ERROR]: [server]: Error: Connection error.
Error: Connection error.
    at OpenAI.makeRequest (/Users/vinamra/Documents/SW Projects/LLMs/Flowise/node_modules/.pnpm/openai@4.28.4/node_modules/openai/core.js:292:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /Users/vinamra/Documents/SW Projects/LLMs/Flowise/node_modules/.pnpm/@langchain+openai@0.0.14/node_modules/@langchain/openai/dist/chat_models.cjs:650:29
    at async RetryOperation._fn (/Users/vinamra/Documents/SW Projects/LLMs/Flowise/node_modules/.pnpm/p-retry@4.6.2/node_modules/p-retry/index.js:50:12)
Expected behavior
The connection should be established successfully.
Setup
Running the app in a dev environment with:
Flowise Version: 1.6.1 (from the build output)
flowise:build: > flowise@1.6.1 build /Users/vinamra/Documents/SW Projects/LLMs/Flowise/packages/server
OS: macOS
Browser: Chrome
I have tried some more chatflows, but all return the same error. I created a new API key in OpenAI as suggested and am using it to connect.
Am I missing something very basic here?