Closed: Std1234 closed this issue 1 year ago.
It's not really working; I have the same problem.
> Need help: when I enter the URL and start chatting with the bot it gives an error. I tried the same website URL on openchat.so and it works fine there. Screenshot: https://user-images.githubusercontent.com/107894082/243981011-27023c57-f2fc-4b73-acfb-5a59a091d42f.png
Make sure your llm-server is working; you can check its logs.
How do I check? Please help.
The LLM server is working.
@Std1234 Check your Docker logs for the llm-server, particularly after you send a test message.
I believe there is a breaking change in a later version of langchainjs, but I'm not sure which. I'm also seeing this behavior, with the following error:
```
error TypeError: chatMessage._getType is not a function
    at file:///usr/src/app/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:67:33
    at Array.map (<anonymous>)
    at ConversationalRetrievalQAChain.getChatHistoryString (file:///usr/src/app/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:66:18)
    at ConversationalRetrievalQAChain._call (file:///usr/src/app/node_modules/langchain/dist/chains/conversational_retrieval_chain.js:90:60)
    at ConversationalRetrievalQAChain.call (file:///usr/src/app/node_modules/langchain/dist/chains/base.js:65:39)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async handler (webpack-internal:///(api)/./pages/api/chat.ts:53:26)
    at async Object.apiResolver (/usr/src/app/node_modules/next/dist/server/api-utils/node.js:372:9)
    at async DevServer.runApi (/usr/src/app/node_modules/next/dist/server/next-server.js:513:9)
    at async Object.fn (/usr/src/app/node_modules/next/dist/server/next-server.js:815:35)
    at async Router.execute (/usr/src/app/node_modules/next/dist/server/router.js:243:32)
    at async DevServer.runImpl (/usr/src/app/node_modules/next/dist/server/base-server.js:432:29)
    at async DevServer.run (/usr/src/app/node_modules/next/dist/server/dev/next-dev-server.js:814:20)
    at async DevServer.handleRequestImpl (/usr/src/app/node_modules/next/dist/server/base-server.js:375:20)
    at async /usr/src/app/node_modules/next/dist/server/base-server.js:157:99
```
I think it's coming from the way `history` is passed into the chain at pages/api/chat.ts:47: the chain is getting a string array instead of chat message objects, and plain strings don't have `_getType` on them.
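For newer langchain versions, one possible workaround (a sketch, not the project's actual code) would be to convert the plain string history into objects that expose `_getType()` before handing it to the chain. A real fix would use `HumanChatMessage` / `AIChatMessage` from `langchain/schema` rather than the stand-in class below, which is only here to keep the example self-contained:

```typescript
// Stand-in sketch: getChatHistoryString in langchain >= 0.0.86 calls
// _getType() on each history entry, so plain strings no longer work.
// StandInChatMessage mimics the shape of the real message classes.

type MessageType = "human" | "ai";

class StandInChatMessage {
  constructor(public text: string, private role: MessageType) {}
  _getType(): MessageType {
    return this.role;
  }
}

// Assumption: history alternates user / assistant turns, user first.
function toChatMessages(history: string[]): StandInChatMessage[] {
  return history.map(
    (text, i) => new StandInChatMessage(text, i % 2 === 0 ? "human" : "ai")
  );
}
```

The alternation assumption (user turn first, then assistant) is a guess about how `history` is shaped in pages/api/chat.ts; adjust it to however the app actually stores turns.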
I narrowed it down to this commit, released with langchain 0.0.86: https://github.com/hwchase17/langchainjs/compare/0.0.86...main#diff-4d0f479be12d9b217277a26941bcfe72cfb9bc6b9bb9209099d6dd3294248bfbR581
If you change your llm-server/package.json to `"langchain": "0.0.85"`, it should work. I don't know the correct way to refactor the existing implementation for newer versions, though; I haven't worked with JS/TS before.
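For reference, the pin in llm-server/package.json would look like this (a minimal fragment; other dependencies omitted):

```json
{
  "dependencies": {
    "langchain": "0.0.85"
  }
}
```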
I tried, but it's still not working 😞
Did you run `npm update langchain` inside llm-server, and then `make install` at the root? I'd also verify that package-lock.json shows 0.0.85.