Closed marcusschiesser closed 8 months ago
LangchainJS also has a repo with examples: https://github.com/langchain-ai/langchain-nextjs-template
LangChain is more about building automated agents. LlamaIndex is more about dealing with data. I guess we'll want to decide which direction Unc should go first?
My hunch is that, for enterprises, dealing with data is more important. That's why I added LlamaIndex as an option. But LangChain is also used for data integration via its support for vector stores.
I analyzed LlamaIndex a bit. Currently, it does not support summarizing the chat history, which is a blocker. I sent them a PR and an issue (https://github.com/run-llama/LlamaIndexTS/issues/101).
LlamaIndex accepted my PR. The design is cleaner and better suited for RAG use cases. So let's go with LlamaIndex.
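For context, the chat-history summarization feature in question can be sketched roughly like this: once the history grows past a budget, older messages are folded into a single rolling summary while the most recent ones are kept verbatim. All names here (`summarizeChatHistory`, the `Message` shape) are illustrative, not the actual LlamaIndexTS API:

```typescript
// Illustrative sketch only -- the real LlamaIndexTS implementation differs.
// A Message is a role/content pair, as in most chat APIs.
type Message = { role: "system" | "user" | "assistant"; content: string };

// Hypothetical summarizer: keep the last `keepLast` messages verbatim and
// compress everything older into one system message. A real implementation
// would call an LLM to write the summary; here we just concatenate the
// older messages for demonstration.
function summarizeChatHistory(history: Message[], keepLast: number): Message[] {
  if (history.length <= keepLast) return history;
  const older = history.slice(0, history.length - keepLast);
  const recent = history.slice(history.length - keepLast);
  const summary: Message = {
    role: "system",
    content:
      "Summary of earlier conversation: " +
      older.map((m) => `${m.role}: ${m.content}`).join(" | "),
  };
  return [summary, ...recent];
}
```

The point of memory like this is that the prompt stays within the model's context window no matter how long the session runs.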
So the first task would be to use LlamaIndex in session.ts instead of the api.llm.chat function.
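The shape of that refactor might look like the following sketch. `ChatEngine`, `EchoEngine`, and `Session` are hypothetical names, not the actual Unc or LlamaIndex code; the point is only that session.ts would depend on a library-provided engine instead of calling the LLM endpoint itself:

```typescript
// Sketch of the refactor, with hypothetical names throughout.
// Instead of session.ts calling api.llm.chat(...) directly, it would
// hold a chat engine (e.g. one from LlamaIndex) behind a minimal interface.
interface ChatEngine {
  chat(message: string): Promise<string>;
}

// Stand-in engine so the sketch runs without an API key; in the real
// refactor this would be a LlamaIndex chat engine configured with the
// server-side OpenAI key.
class EchoEngine implements ChatEngine {
  async chat(message: string): Promise<string> {
    return `echo: ${message}`;
  }
}

// A session after the refactor: it only knows about the ChatEngine
// interface, so vector stores, memory, or a different provider can be
// swapped in behind it without touching session code.
class Session {
  constructor(private engine: ChatEngine) {}

  async send(userMessage: string): Promise<string> {
    return this.engine.chat(userMessage);
  }
}
```

In this design, injecting the OpenAI key for shared bots stays a server-side concern of whatever constructs the engine, not of the session itself.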
However, I analyzed the code; the change is non-trivial as we inject the OpenAI key on the server side for shared bots.
Also, we're not directly benefiting from refactoring to LlamaIndex. We'd be better off putting LlamaIndex to use first by implementing RAG (see #22).
Using LlamaIndex, see https://github.com/marcusschiesser/unc-llamaindex
Currently, Unc is directly calling the LLM API. That makes it complicated to use vector stores, memory features, and different APIs.
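Concretely, the coupling looks something like this simplified stand-in (not a copy of the actual code): the request is built by hand against one provider's wire format (OpenAI's /v1/chat/completions), so memory, retrieval, and support for other providers all have to be bolted onto that one call site. `buildOpenAIRequest` and `ChatMessage` are illustrative names:

```typescript
// Simplified stand-in for the current approach: the request body is
// constructed directly in OpenAI's chat-completions format. Swapping the
// provider or inserting retrieved context means editing this function,
// which is exactly what an LLM library would abstract away.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildOpenAIRequest(model: string, messages: ChatMessage[]) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    body: { model, messages },
  };
}
```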
Update session.ts by using one of the common LLM libraries: LangchainJS is more popular, so it looks like a valid starting point.