FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0
29.39k stars 15.19k forks

[BUG]Retriever Tool #2308

Closed — nag-24 closed this issue 4 months ago

nag-24 commented 4 months ago

Describe the bug: The Retriever Tool fetches the matching docs correctly from Pinecone's database, but the retrievals are not sent correctly to the LLM.

To Reproduce: See the LangSmith trace, where the correct document is retrieved but the tool output is sent to the LLM as "toolOutput": "\n\n\n\n\n\n" — https://smith.langchain.com/public/6595062e-ac4f-4d30-85aa-e36b809c8d57/r

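The empty "toolOutput" in the trace is consistent with the text key mismatch identified later in this thread: if the Pinecone records store their text under a metadata key the vector store does not read (e.g. "content" instead of "text"), every reconstructed document gets an empty pageContent, and joining them produces only separators. The sketch below is illustrative, not Flowise's actual code; the record shapes and key names are assumptions.

```typescript
// Illustrative Document shape (mirrors LangChain's, but defined locally)
interface Doc {
  pageContent: string;
  metadata: Record<string, unknown>;
}

// Hypothetical Pinecone matches whose text is stored under "content"
const matches = [
  { metadata: { content: "Doc 1 text", source: "a.pdf" } },
  { metadata: { content: "Doc 2 text", source: "b.pdf" } },
  { metadata: { content: "Doc 3 text", source: "c.pdf" } },
  { metadata: { content: "Doc 4 text", source: "d.pdf" } },
];

// The key the vector store wrapper reads when rebuilding documents
const textKey = "text";

// Rebuild documents the way a typical vector store wrapper does:
// the text key is missing, so every pageContent falls back to ""
const docs: Doc[] = matches.map((m) => ({
  pageContent: ((m.metadata as Record<string, unknown>)[textKey] as string) ?? "",
  metadata: m.metadata,
}));

// A typical retriever-tool output: concatenated pageContent
const toolOutput = docs.map((d) => d.pageContent).join("\n\n");
console.log(JSON.stringify(toolOutput)); // prints "\n\n\n\n\n\n" — only separators
```

Four empty strings joined by "\n\n" yield exactly the six newlines seen in the trace.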



Flow: see attached screenshot of the chatflow (Screenshot 2024-04-30 223530).


Additional context: The LLM responses work fine when the same index is accessed from a Jupyter notebook using LangChain Python code — see the attached screenshot of the notebook output.

nag-24 commented 4 months ago

Also see this LangSmith trace using the Conversational Retrieval QA chain: https://smith.langchain.com/public/dc94321b-f83d-49bd-8fca-4d308daf52ee/r — the retrieval is again correct, but the context is sent as "context": "undefined\nundefined\nundefined\nundefined".

nag-24 commented 4 months ago

Also see this LangSmith trace using the Conversational Agent: https://smith.langchain.com/public/b54e85e9-3313-4954-ba94-f0947aa2e32c/r — it fails with "Received tool input did not match expected schema":

Error: Received tool input did not match expected schema
    at DynamicStructuredTool.call (/usr/src/node_modules/.pnpm/@langchain+core@0.1.57/node_modules/@langchain/core/dist/tools.cjs:68:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /usr/src/packages/components/dist/src/agents.js:315:39
    at async Promise.all (index 0)
    at async AgentExecutor.call (/usr/src/packages/components/dist/src/agents.js:302:30)
    at async AgentExecutor.invoke (/usr/src/node_modules/.pnpm/langchain@0.1.33@aws-crypto+sha256-js@5.2.0@aws-sdk+client-bedrock-runtime@3.422.0@aws-sdk_4nagnmezyqrtxortipronoidfi/node_modules/langchain/dist/chains/base.cjs:58:28)
    at async ConversationalAgent_Agents.run (/usr/src/packages/components/dist/nodes/agents/ConversationalAgent/ConversationalAgent.js:119:19)
    at async utilBuildChatflow (/usr/src/packages/server/dist/utils/buildChatflow.js:243:15)
    at async createInternalPrediction (/usr/src/packages/server/dist/controllers/internal-predictions/index.js:7:29)
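This is a different failure mode from the empty retrievals: a structured tool validates the arguments the agent produces against its declared input schema before running, and throws when the shape does not match. A schema-library-free sketch of that mechanism (the field names and check are illustrative, not Flowise's actual tool schema):

```typescript
// Hypothetical schema check: the tool expects { input: string }
const matchesSchema = (value: unknown): value is { input: string } =>
  typeof value === "object" &&
  value !== null &&
  typeof (value as { input?: unknown }).input === "string";

// Sketch of a structured tool's call path: validate, then run
function callTool(rawInput: unknown): string {
  if (!matchesSchema(rawInput)) {
    // Mirrors the message in the stack trace above
    throw new Error("Received tool input did not match expected schema");
  }
  return `ok: ${rawInput.input}`;
}

console.log(callTool({ input: "what is Flowise?" })); // prints "ok: what is Flowise?"
```

In LangChain, `DynamicStructuredTool` performs this validation with a zod schema; when the agent emits malformed arguments (often itself a symptom of confusing tool output upstream), the call is rejected before the tool body runs.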

nag-24 commented 4 months ago

It works now: I changed the metadata field in my Pinecone index from "content" to "text". See this LangSmith trace: https://smith.langchain.com/public/078984df-3237-4275-845a-cc633291fc2d/r. I think the issue is fixed now.
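The fix above can be reduced to a one-line invariant: the metadata key that holds the raw text must match the key the vector store wrapper reads (commonly "text" by default). A minimal sketch, with illustrative record shapes:

```typescript
// The key the vector store wrapper reads when rebuilding documents
const textKey = "text";

// Same record, before and after renaming the metadata field
const before = { metadata: { content: "What is Flowise?" } as Record<string, string> };
const after = { metadata: { text: "What is Flowise?" } as Record<string, string> };

const readText = (rec: { metadata: Record<string, string> }) => rec.metadata[textKey];

console.log(readText(before)); // prints undefined — the broken traces above
console.log(readText(after));  // prints "What is Flowise?"
```

As an alternative to re-indexing, some LangChain Pinecone integrations expose a configurable text key (e.g. a `textKey` option on the JS `PineconeStore`), which could be pointed at the existing metadata field instead — worth checking whether the Flowise Pinecone node surfaces it.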