If I keep all my dependencies locked to what is originally in package-lock.json, everything seems to work.
However, if I update Vercel's ai package beyond ai@2.1.10, I get the hanging issue, which suggests something in the ai callbacks is not working. ai@2.1.13 moves to a new way of doing callbacks that the latest langchain requires: https://github.com/vercel-labs/ai/commit/8bf637adb03c61526ae5fa9d01357652662e539a
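For reference, the handlers-based wiring the newer versions expect looks roughly like the sketch below. This is an assumption based on the ai SDK examples from around that release, not this repo's code; exact import paths and call signatures vary with the langchain version:

```ts
import { LangChainStream, StreamingTextResponse } from 'ai';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { HumanMessage } from 'langchain/schema';

export async function POST(req: Request) {
  const { prompt } = await req.json();

  // handlers is a LangChain callback handler that writes tokens into
  // stream and closes it when the run ends.
  const { stream, handlers } = LangChainStream();

  const llm = new ChatOpenAI({ streaming: true });

  // Deliberately not awaited: return the stream right away and let the
  // handlers close it when the LLM run finishes.
  llm.call([new HumanMessage(prompt)], {}, [handlers]).catch(console.error);

  return new StreamingTextResponse(stream);
}
```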
The ai module is doing a lot of work to try to handle all manner of callbacks: https://github.com/vercel-labs/ai/issues/205
Thanks @ejc3! I will try that out
Unrelated to this bug, I feel like having both langchain and Vercel's ai libraries makes the code more convoluted. Maybe in the future just one of the libraries could be used, to make the code easier to follow and less brittle.
Thanks for the feedback @ejc3, we'll definitely look into it and clean up the code.
Appreciate your input!
After following all the instructions to set up and entering a prompt via the UI, I see an infinite loading screen in the `QAModal`. In the console I can see the logs generated from the API request to OpenAI, and it seems to exit the LLM successfully (I see `[llm/end] [1:llm:openai] [2.21s] Exiting LLM run with output: { "generations": [...] }`). However, no subsequent logs in `route.ts` appear after the `chain.call(...)`, leading me to believe that the request is hanging. Why might this be the case?
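A minimal sketch of the pattern described (a hypothetical reconstruction; the actual `route.ts`, chain setup, and names in the repo will differ):

```ts
// Hypothetical route.ts reconstructing the reported pattern -- the real
// repo's chain, prompt, and variable names are assumptions.
import { LangChainStream, StreamingTextResponse } from 'ai';
import { ChatOpenAI } from 'langchain/chat_models/openai';
import { LLMChain } from 'langchain/chains';
import { PromptTemplate } from 'langchain/prompts';

export async function POST(req: Request) {
  const { question } = await req.json();
  const { stream, handlers } = LangChainStream();

  const chain = new LLMChain({
    llm: new ChatOpenAI({ streaming: true }),
    prompt: PromptTemplate.fromTemplate('Answer concisely: {question}'),
  });

  // Awaiting here means the stream has no consumer yet (the Response is
  // only returned afterwards). If the token callbacks block on writing to
  // the unread stream, chain.call() never resolves -- which would match
  // the hang: [llm/end] is logged by LangChain, but the line below never is.
  const result = await chain.call({ question }, [handlers]);
  console.log('after chain.call', result); // never printed while hanging

  return new StreamingTextResponse(stream);
}
```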