a16z-infra / companion-app

AI companions with memory: a lightweight stack to create and host your own AI companions
https://ai-companion-stack.com/
MIT License

Call to `chain.call` in `route.ts` is hanging, causing infinite loading #64

Closed MichaelKim39 closed 1 year ago

MichaelKim39 commented 1 year ago

After following all the instructions to set up and entering a prompt via the UI, I see an infinite loading screen in the QAModal.

In the console I can see the logs generated from the API request to OpenAI, and the LLM run seems to exit successfully (I see [llm/end] [1:llm:openai] [2.21s] Exiting LLM run with output: { "generations": [...] })

However, no subsequent logs in route.ts appear after the chain.call(...), which leads me to believe that the request is hanging.

Why might this be the case?

ejc3 commented 1 year ago

If I keep all my dependencies locked to what is originally in package-lock.json, everything seems to work.
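One way to stay on the known-good versions is to pin the dependency exactly rather than with a caret range, so npm never silently upgrades it. A minimal sketch of the relevant package.json entry (the 2.1.10 version number is from this thread; surrounding fields are omitted):

```json
{
  "dependencies": {
    "ai": "2.1.10"
  }
}
```

Equivalently, `npm install ai@2.1.10 --save-exact` writes the pinned version for you.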

However, if I update Vercel's ai package beyond ai@2.1.10, I get the hanging issue. It seems like something in the ai callbacks is not working: ai@2.1.13 moves to a new way of doing callbacks that the latest langchain requires: https://github.com/vercel-labs/ai/commit/8bf637adb03c61526ae5fa9d01357652662e539a
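The failure mode described above can be sketched abstractly. This is a self-contained simulation, not the real ai or langchain APIs: the streaming response is only closed from inside a completion callback, so if the callback wiring changes between versions and the handler never fires, anything awaiting the stream waits forever.

```typescript
// Hypothetical stand-in for a streaming helper: the returned promise
// settles only when the onCompletion handler is invoked.
type Handlers = { onCompletion: () => void };

function makeStream(): { done: Promise<string>; handlers: Handlers } {
  let close!: () => void;
  const done = new Promise<string>((resolve) => {
    close = () => resolve("stream closed");
  });
  return { done, handlers: { onCompletion: close } };
}

// Old behavior: the chain invokes the completion handler, so the stream closes.
async function chainCallThatFiresCallbacks(h: Handlers): Promise<void> {
  h.onCompletion();
}

// Broken behavior: the callback contract changed, so the handler is never
// invoked and the stream is never closed.
async function chainCallThatDropsCallbacks(_h: Handlers): Promise<void> {
  /* completion handler never called */
}

async function demo(): Promise<void> {
  const ok = makeStream();
  await chainCallThatFiresCallbacks(ok.handlers);
  console.log(await ok.done); // resolves normally

  const broken = makeStream();
  await chainCallThatDropsCallbacks(broken.handlers);
  // Awaiting broken.done directly would hang forever (the infinite loading
  // screen); race it against a timeout so the demo terminates.
  const result = await Promise.race([
    broken.done,
    new Promise<string>((r) =>
      setTimeout(() => r("timed out: stream never closed"), 100)
    ),
  ]);
  console.log(result);
}

demo();
```

The fix in the thread amounts to keeping the chain's callback style and the streaming helper's expectations in sync, so the "close the stream" handler is actually invoked.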

ejc3 commented 1 year ago

There's a lot of work the ai module is doing to try to handle all manner of callbacks: https://github.com/vercel-labs/ai/issues/205

MichaelKim39 commented 1 year ago

Thanks @ejc3! I will try that out

ejc3 commented 1 year ago

Unrelated to this bug, I feel like having both langchain and Vercel's ai libraries makes the code more convoluted. Maybe in the future just one of the libraries could be used, to make the code easier to follow and less brittle.

jenniferli23 commented 1 year ago

Thanks for the feedback @ejc3, we'll definitely look into it and clean up the code.

Appreciate your input!