Closed · the-trading-ai closed this 4 months ago
Hey @the-trading-ai
If you open the LLM Chain node, is there a larger error in there? Can you also share the workflow JSON so we can use it to reproduce the issue?
Here's the process: (someone told me that this problem appeared yesterday in the LangChain community with the Hugging Face library)
TypeError: message.toJSON is not a function
at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
at Array.map (<anonymous>)
OK, it looks like this is a general issue, probably from the latest version. I updated to the latest version yesterday.
Same problem here using any type of Summarization Chain (1.29.1):
TypeError: message.toJSON is not a function
at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
at Array.map (<anonymous>)
I'm also seeing the same error on n8n@1.29.1:
TypeError: message.toJSON is not a function
at /usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:73
at Array.map (<anonymous>)
at Proxy.connectionType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/logWrapper.js:168:48)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Proxy._generateUncached (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/node_modules/@langchain/core/dist/language_models/llms.cjs:138:22)
at LLMChain._call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/llm_chain.cjs:157:37)
at LLMChain.call (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/node_modules/langchain/dist/chains/base.cjs:120:28)
at createSimpleLLMChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:84:23)
at getChain (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:93:16)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:360:31)
^ This issue affects all LLM functions -- including OpenAI (not just HuggingFace).
Pretty sure this is the line that the error is referring to: https://github.com/n8n-io/n8n/blob/8c14ca7ad1525df73d762be2c4702ecf8a118414/packages/%40n8n/nodes-langchain/utils/logWrapper.ts#L265
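For context, the stack trace points at a `.map()` over chat messages that calls `message.toJSON()` on each one; if the message object (e.g. from a mismatched `@langchain/core` version) doesn't expose `toJSON`, that call throws exactly this TypeError. A minimal defensive sketch of that mapping (hypothetical `serializeChatMessages` helper, not the actual n8n code):

```javascript
// Hypothetical defensive version of the mapping in logWrapper.ts.
// If a message implements toJSON (as LangChain Serializable messages do),
// use it; otherwise fall back to a plain object so the .map() never throws
// "TypeError: message.toJSON is not a function".
function serializeChatMessages(messages) {
  return messages.map((message) =>
    typeof message.toJSON === 'function'
      ? message.toJSON() // Serializable message path
      : { content: message.content ?? String(message) } // plain-object fallback
  );
}

// Example: mixing a toJSON-capable message with a plain object no longer throws.
const serialized = serializeChatMessages([
  { content: 'hi', toJSON() { return { content: this.content, serialized: true }; } },
  { content: 'plain object without toJSON' },
]);
console.log(serialized);
```

This is only an illustration of why the unguarded `toJSON()` call fails when message classes from two different `@langchain/core` copies are mixed, as can happen with nested `node_modules` like the paths in the trace above.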
This code was introduced by this commit: https://github.com/n8n-io/n8n/commit/7501ad8f3c56d9fcc5f4ec3d6fc468ab9cdb5024
As part of this PR: https://github.com/n8n-io/n8n/pull/8526
I'm going to upgrade to n8n@1.30.0 and see if that fixes the issue.
We will look into this in the morning and potentially put out a new release. For now, though, you can go back to a previous release and you should be good to go.
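For anyone needing that rollback: a sketch of how to pin a specific version, assuming either a global npm install or the official Docker image (1.27.3 is the last version reported unaffected in this thread; verify the tag exists for your setup before relying on it):

```shell
# Roll a global npm install back to a known-good release
npm install -g n8n@1.27.3

# Or, if running via Docker, pin the image tag instead of "latest":
# docker run -it --rm -p 5678:5678 docker.n8n.io/n8nio/n8n:1.27.3
```

Pinning the tag also prevents an unattended restart from silently pulling the broken version again.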
Getting a different error on n8n@1.30.0 - FYI:
Error: Could not get parameter
at getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1513:15)
at Object.getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:2189:24)
at getPromptInputByType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/helpers.js:29:24)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:396:61)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:662:53
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1064:20
Ironically, the Conversational Agent node works in n8n@1.30.0 -- it's just the Basic LLM Chain node that appears to be broken.
Confirmed that this issue is not present in n8n@1.27.3 -- it is present in all later versions.
Fix got released with n8n@1.30.1
Good news: this should now be resolved. I am going to mark this as closed; if you are still seeing this issue, let me know.
@Joffcom, no dice. I'm still seeing these errors using the Basic LLM Chain node in n8n@1.30.1. CC: @janober
Error: Could not get parameter
at getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:1514:15)
at Object.getNodeParameter (/usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/NodeExecuteFunctions.js:2190:24)
at getPromptInputByType (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/utils/helpers.js:29:24)
at Object.execute (/usr/local/lib/node_modules/n8n/node_modules/@n8n/n8n-nodes-langchain/dist/nodes/chains/ChainLLM/ChainLlm.node.js:396:61)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at Workflow.runNode (/usr/local/lib/node_modules/n8n/node_modules/n8n-workflow/dist/Workflow.js:730:19)
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:660:53
at /usr/local/lib/node_modules/n8n/node_modules/n8n-core/dist/WorkflowExecute.js:1062:20
^ This workflow works fine on n8n@1.27.3
Hey @Joffcom and @janober , I think I sort of understand the issue.
This error appears only for legacy workflows and legacy nodes. If I create a brand new workflow with the same identical nodes, this error doesn't exist.
This tells me that there's some backwards compatibility issue that's breaking older workflows/nodes.
Steps to reproduce: create a Basic LLM Chain workflow in n8n@1.27.3, then upgrade.
As it stands right now, if there's no way to "auto-upgrade" legacy workflows/nodes to support new 1.30.1+ features, then all your existing users will experience this same pain along the way. I'm not sure how to solve this issue, but FYI.
Recreating the simple Basic LLM Chain workflow has no issues:
Oh, interesting... the newer Basic LLM Chain node in n8n@1.30.1 no longer has the Output Parser optional connector -- I don't know if that's intended or a possible secondary regression, @Joffcom. Compare the past couple of screenshots and you'll see what I mean.
Yup, confirmed. In n8n@1.30.1+, I had to manually regenerate the Basic LLM Chain node to get it to work. Here's the side-by-side comparison:
@dkindlund the good news is that this sounds like a different issue, so I believe the original issue here is solved and there is another one to look into.
You also don't need to worry about tagging us: if we are commenting, we will get notifications on new posts. We are also mainly around during Berlin office hours 🙂
We will look into this new issue on Monday morning. It may be worth opening a new issue so it doesn't get confused with the original issue here.
Bug Description
When I connect a Chat to the LLM Chain node and use Hugging Face, I get an error message: message.toJSON is not a function
To Reproduce
Expected behavior
The message from the model
Operating System
hosted
n8n Version
Latest 1.29
Node.js Version
hosted
Database
SQLite (default)
Execution mode
main (default)