langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

ConversationalRetrievalQAChain with FirestoreChatMessageHistory: problem with chat_history #2227

Closed Zained closed 1 year ago

Zained commented 1 year ago

Hello,

I would like to use ConversationalRetrievalQAChain with FirestoreChatMessageHistory.

Did someone manage to get Firestore history working with ConversationalRetrievalQAChain?

Here is a snippet of my code:

    // Imports assumed by this snippet (paths per langchain ~0.0.12x):
    import * as fs from "fs";
    import * as admin_fs from "firebase-admin";
    import { FirestoreChatMessageHistory } from "langchain/stores/message/firestore";
    import { BufferMemory } from "langchain/memory";
    import { ChatOpenAI } from "langchain/chat_models/openai";
    import { ConversationalRetrievalQAChain } from "langchain/chains";

    const fileContent = fs.readFileSync("serviceAccountKey.json", "utf8");
    const serviceAccount = JSON.parse(fileContent);

    admin_fs.initializeApp({
      credential: admin_fs.credential.cert(serviceAccount)
    });

    const vectorStore = await init_vectorStore();

    const chatHistory = new FirestoreChatMessageHistory({
        collectionName: "DEMO_LANGCHAIN",
        sessionId: "lc-example_8",
        userId: "test@test.com",
        config: {
            admin_fs
        },
    })

    const memory = new BufferMemory({
        chatHistory,
        memoryKey: "chat_history",
        inputKey: "question",
    });

    const model = new ChatOpenAI({
        openAIApiKey: '<API KEY>'
    });

    const chain = ConversationalRetrievalQAChain.fromLLM(
        model,
        vectorStore.asRetriever(),
        {
            memory: memory
        }
    );

    const res1 = await chain.call({ question: "Hello, my name is John", chat_history: chatHistory });

    console.log({ res1 });
    // { 
    //     res1: { 
    //         text: 'Hello John! How can I assist you today?' 
    //      } 
    // }

    const res2 = await chain.call({ question: "What's my name?", chat_history: chatHistory });

    console.log({ res2 });
    // {
    //     res2: {
    //         text: "I'm sorry, but I do not have access to personal information."
    //     }
    // }
dosubot[bot] commented 1 year ago

Answer generated by a 🤖

Answer

Hello,

Based on the context provided, it seems like you're trying to use Firestore as a memory store for the ConversationalRetrievalQAChain in the LangChain framework.

From the information I found in the repository, there was a similar issue titled Issues with ConversationalRetrievalQA chain. The user suggested that the issue might be due to the dataset used in the Supabase Vector Store. They recommended using a richer dataset, similar to the one used in the PR where built-in memory was introduced to the ConversationalRetrievalQAChain. They also suggested making sure that the latest version of LangChain is being used, as the memory feature was recently introduced.

However, it's important to note that the solution provided in the issue might not directly solve your problem as it doesn't specifically address Firestore integration.

In the context, there's a hypothetical solution suggesting the use of a FirestoreMemory class. However, please note that this class and its path "langchain/memory/firestore" are hypothetical and used as an example. The actual implementation may vary based on the LangChain framework's design and structure.

Here's the hypothetical solution:

import { FirestoreMemory } from "langchain/memory/firestore";

// ... (slowerModel and fasterModel are assumed to be previously
// constructed chat model instances)

const chain = ConversationalRetrievalQAChain.fromLLM(
  slowerModel,
  vectorStore.asRetriever(),
  {
    returnSourceDocuments: true,
    memory: new FirestoreMemory({
      memoryKey: "chat_history",
      inputKey: "question", // The key for the input to the chain
      outputKey: "text", // The key for the final conversational output of the chain
      returnMessages: true, // If using with a chat model (e.g. gpt-3.5 or gpt-4)
      firestoreConfig: {
        // Your Firestore configuration here
      },
    }),
    questionGeneratorChainOptions: {
      llm: fasterModel,
    },
  }
);

Unfortunately, I couldn't find any specific documentation or code in the LangChain repository that confirms the existence of a FirestoreMemory class or similar. Therefore, I would recommend checking the latest version of LangChain (Release 0.0.125) and its documentation to see if there have been any updates regarding Firestore integration.

If you have any additional information or context that could help me better understand your issue, please provide it. I'm here to help!

Best, Dosu

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

jacoblee93 commented 1 year ago

I think the bot comment is actually directionally correct (with a hallucinated class, haha). I can't be sure without seeing the error, but the conversational QA chain has two outputs, so you need to specify an output key to tell the memory which one to use. Try:

    const memory = new BufferMemory({
        chatHistory,
        memoryKey: "chat_history",
        inputKey: "question",
        outputKey: "text",
    });

Docs: https://js.langchain.com/docs/modules/chains/popular/chat_vector_db#built-in-memory
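
To illustrate why `outputKey` matters, here is a small self-contained sketch of the decision the memory has to make on save. This is a mock, not LangChain's actual implementation: `pickOutput` and its error message are hypothetical, but the shape of the problem (a chain result with both `text` and `sourceDocuments`) matches the chain above.

```typescript
// Mock of the memory's save step (illustrative; not LangChain's real code).
// ConversationalRetrievalQAChain can return several outputs, e.g.
// { text, sourceDocuments }; the memory must know which one to persist.
type ChainOutput = Record<string, unknown>;

function pickOutput(outputs: ChainOutput, outputKey?: string): unknown {
  const keys = Object.keys(outputs);
  // An explicit outputKey removes all ambiguity:
  if (outputKey !== undefined) return outputs[outputKey];
  // With exactly one output, the choice is obvious:
  if (keys.length === 1) return outputs[keys[0]];
  // With multiple outputs and no outputKey, the memory cannot decide:
  throw new Error(`One output key expected, got ${keys.length}`);
}

const result = { text: "Hello John!", sourceDocuments: [] };
// pickOutput(result) would throw here; with outputKey it works:
console.log(pickOutput(result, "text")); // "Hello John!"
```

This is why adding `outputKey: "text"` to the `BufferMemory` options lets the memory store the conversational answer instead of failing on the ambiguous result.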

Zained commented 1 year ago

@jacoblee93 Even with your suggestion, the entry in Firestore is written, but it is not read back on the follow-up question. Basically, the chain is not reading properly from the database. It works fine with ConversationChain, so I guess something is wrong here, but I don't know where to dig. I'll be patient and hope for an update. Thanks anyway!
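
For readers debugging the same "writes but never reads" behavior: the memory has to complete a round trip, loading the stored messages before the LLM call and saving the new exchange after it. The sketch below is a self-contained mock of that round trip (the class names `InMemoryHistory`, `MockBufferMemory`, and methods `load`/`save` are invented for illustration and are not LangChain's API). If the Firestore-backed equivalent of `load()` comes back empty on the second call, the follow-up question loses its context, which matches the symptom above.

```typescript
// Mock of the memory round trip (illustrative; not LangChain code).
class InMemoryHistory {
  messages: { role: string; content: string }[] = [];
  addMessage(role: string, content: string) {
    this.messages.push({ role, content });
  }
}

class MockBufferMemory {
  constructor(
    private history: InMemoryHistory,
    private inputKey: string,
    private outputKey: string,
  ) {}

  // Called before generating: what the LLM receives as chat_history.
  load(): string {
    return this.history.messages
      .map((m) => `${m.role}: ${m.content}`)
      .join("\n");
  }

  // Called after generating: persists the exchange using the two keys.
  save(inputs: Record<string, string>, outputs: Record<string, string>) {
    this.history.addMessage("human", inputs[this.inputKey]);
    this.history.addMessage("ai", outputs[this.outputKey]);
  }
}

const memory = new MockBufferMemory(new InMemoryHistory(), "question", "text");
memory.save({ question: "Hello, my name is John" }, { text: "Hello John!" });
// The second call should see the first exchange:
console.log(memory.load());
// human: Hello, my name is John
// ai: Hello John!
```

A practical check against the real classes is to log the result of reading the chat history directly between the two `chain.call` invocations; if the stored messages come back but the chain's answer still lacks context, the problem is in how the chain consumes the memory rather than in the Firestore read itself.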