langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/

DOC: For QA how to pass chat history in stream mode and also get source with rag chain? #5569

Closed · djaffer closed this issue 2 months ago

djaffer commented 6 months ago

Checklist

Issue with current documentation:

Getting this error:

```
TypeError: str.normalize is not a function
    at processInput (/app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/tokenizer/index.js:65:26)
    at Tokenizer.encode (/app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/tokenizer/index.js:87:17)
    at /app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/index.js:128:86
    at Array.map ()
    at UniversalSentenceEncoder. (/app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/index.js:128:44)
    at step (/app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/index.js:48:23)
    at Object.next (/app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/index.js:29:53)
    at /app/node_modules/@tensorflow-models/universal-sentence-encoder/dist/index.js:23:71
    at new Promise ()
```

```
const questionPrompt = PromptTemplate.fromTemplate(`
CONTEXT: {context}

CHAT HISTORY: {chatHistory}

QUESTION: {question}

Helpful Answer:`);

const ragChainFromDocs = RunnableSequence.from([
  {
    question: (input) => input.question,
    chatHistory: (input) => input.chatHistory ?? "",
    context: async (input) => {
      const relevantDocs = await retriever.invoke(input.question);
      const serialized = formatDocumentsAsString(relevantDocs);
      return serialized;
    },
  },
  questionPrompt,
  this.llm,
  new StringOutputParser(),
]);

let ragChainWithSource = new RunnableMap({
  steps: {
    context: retriever,
    question: new RunnablePassthrough(),
  },
});
ragChainWithSource = ragChainWithSource.assign({
  answer: ragChainFromDocs,
});
```

`ragChainWithSource.invoke({ question, chatHistory })` doesn't work.
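
A likely cause of the `str.normalize` error in this snippet: the `context: retriever` step of the `RunnableMap` receives the entire `{ question, chatHistory }` input object, so the universal-sentence-encoder tokenizer is handed an object rather than a string. A minimal sketch of one possible fix (assuming the same `retriever` and `ragChainFromDocs` defined above) is to route only the question string to the retriever and carry `chatHistory` through explicitly:

```
import { RunnableMap } from "@langchain/core/runnables";

// Sketch: hand the retriever only `input.question`, and keep `chatHistory`
// in the map output so it still reaches the prompt inside `ragChainFromDocs`.
const chainWithSource = RunnableMap.from({
  context: (input) => retriever.invoke(input.question),
  question: (input) => input.question,
  chatHistory: (input) => input.chatHistory ?? "",
}).assign({
  answer: ragChainFromDocs,
});

// Both values are plain strings, so the embedding tokenizer never sees an object.
const result = await chainWithSource.invoke({ question, chatHistory });
```

The `context` key still holds the retrieved documents, so the chain keeps returning its sources alongside the answer.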

Idea or request for content:

No response

dosubot[bot] commented 6 months ago

To correctly pass chat history when invoking the RAG chain and avoid the `TypeError: str.normalize is not a function` error, ensure that the `getMessageHistory` function returns an instance of `BaseChatMessageHistory` or `BaseListChatMessageHistory`. Additionally, make sure that `historyMessagesKey` is set correctly if you are using it.

Here is an example of how to set up and invoke the chain with history:

```
import { ChatOpenAI } from "@langchain/openai";
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import {
  RunnableConfig,
  RunnableWithMessageHistory,
} from "@langchain/core/runnables";
import { ChatMessageHistory } from "@langchain/community/stores/message/in_memory";
import { HumanMessage, AIMessage } from "@langchain/core/messages";

// Construct your runnable with a prompt and chat model.
const model = new ChatOpenAI({});
const prompt = ChatPromptTemplate.fromMessages([
  ["ai", "You are a helpful assistant"],
  new MessagesPlaceholder("history"),
  ["human", "{input}"],
]);
const runnable = prompt.pipe(model);
const messageHistory = new ChatMessageHistory();

// Add initial messages to the history
await messageHistory.addMessage(new HumanMessage("hi"));
await messageHistory.addMessage(new AIMessage("what is up?"));

// Define a RunnableConfig object, with a `configurable` key.
const config: RunnableConfig = { configurable: { sessionId: "1" } };
const withHistory = new RunnableWithMessageHistory({
  runnable,
  getMessageHistory: (_sessionId: string) => messageHistory,
  inputMessagesKey: "input",
  historyMessagesKey: "history",
  // Passing config through here instead of through the invoke method
  config,
});

const output = await withHistory.invoke({
  input: "Hello there, I'm Archibald!",
});
console.log("output:", output);
/**
output: AIMessage {
  lc_namespace: [ 'langchain_core', 'messages' ],
  content: 'Hello, Archibald! How can I assist you today?',
  additional_kwargs: { function_call: undefined, tool_calls: undefined }
}
 */

/**
 * You can see the LangSmith traces here:
 * @link https://smith.langchain.com/public/ee264a77-b767-4b5a-8573-efcbebaa5c80/r
 */
```
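
Since the original question is also about streaming: runnables in `@langchain/core` expose a `.stream()` method, so the same history-aware chain can stream its output. A minimal sketch, reusing the `withHistory` object from the example above (each chunk from a chat model is an `AIMessageChunk`):

```
// Stream tokens instead of waiting for the full response. `config` was
// already passed to the RunnableWithMessageHistory constructor above.
const stream = await withHistory.stream({
  input: "Hello there, I'm Archibald!",
});

for await (const chunk of stream) {
  // Each chunk carries a fragment of the reply as it is generated.
  process.stdout.write(String(chunk.content));
}
```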

Alternatively, you can use a different message history store, such as Upstash Redis, as shown in the following example:

```
import {
  ChatPromptTemplate,
  MessagesPlaceholder,
} from "@langchain/core/prompts";
import { ChatAnthropic } from "@langchain/anthropic";
import { RunnableWithMessageHistory } from "@langchain/core/runnables";
import { UpstashRedisChatMessageHistory } from "@langchain/community/stores/message/upstash_redis";

const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You're an assistant who's good at {ability}"],
  new MessagesPlaceholder("history"),
  ["human", "{question}"],
]);

const chain = prompt.pipe(
  new ChatAnthropic({ model: "claude-3-sonnet-20240229" })
);

const chainWithHistory = new RunnableWithMessageHistory({
  runnable: chain,
  getMessageHistory: (sessionId) =>
    new UpstashRedisChatMessageHistory({
      sessionId,
      config: {
        url: process.env.UPSTASH_REDIS_REST_URL!,
        token: process.env.UPSTASH_REDIS_REST_TOKEN!,
      },
    }),
  inputMessagesKey: "question",
  historyMessagesKey: "history",
});

const result = await chainWithHistory.invoke(
  {
    ability: "math",
    question: "What does cosine mean?",
  },
  {
    configurable: {
      sessionId: "foobarbaz",
    },
  }
);

console.log(result);

const result2 = await chainWithHistory.invoke(
  {
    ability: "math",
    question: "What's its inverse?",
  },
  {
    configurable: {
      sessionId: "foobarbaz",
    },
  }
);

console.log(result2);
```
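
Note that both calls pass the same `sessionId` ("foobarbaz"), so the second question, "What's its inverse?", is answered against the stored history: the model sees the earlier cosine exchange loaded from Upstash and can resolve what "its" refers to.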

Ensure that the getMessageHistory function is correctly implemented and returns the appropriate history object. This setup should help you avoid the TypeError: str.normalize is not a function error [1][2][3][4].
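
If you need more than one concurrent session, one common pattern (a sketch, not something from this thread) is to keep a map of per-session histories so that `getMessageHistory` always returns the same instance for a given `sessionId`:

```
import { ChatMessageHistory } from "@langchain/community/stores/message/in_memory";

// Sketch: one in-memory history per session. getMessageHistory then returns
// a stable BaseChatMessageHistory instance for each sessionId.
const histories = new Map<string, ChatMessageHistory>();

const getMessageHistory = (sessionId: string) => {
  let history = histories.get(sessionId);
  if (history === undefined) {
    history = new ChatMessageHistory();
    histories.set(sessionId, history);
  }
  return history;
};
```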

**About Dosu:** This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

To continue the conversation, mention @dosu.

djaffer commented 5 months ago

Any update on how to pass chat history with streaming?

dosubot[bot] commented 5 months ago

To address the issue, ensure that the chatHistory is correctly formatted and passed. Here’s a refined approach to integrate chat history in your RAG chain:

1. **Ensure proper chat history handling:** make sure your `chatHistory` is an array of messages, each with a role (e.g., "human" or "ai") and content.

2. **Update your chain setup:** modify your chain so the chat history is carried through to the prompt.

Here’s an example:

```
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { RunnableMap, RunnableSequence } from "@langchain/core/runnables";
import { StringOutputParser } from "@langchain/core/output_parsers";
import { formatDocumentsAsString } from "langchain/util/document";

// Define your prompt template
const questionPrompt = PromptTemplate.fromTemplate(`
  ----------------
  CONTEXT: {context}
  ----------------
  CHAT HISTORY: {chatHistory}
  ----------------
  QUESTION: {question}
  ----------------
  Helpful Answer:
`);

// Define your RAG chain (`retriever` is assumed to be defined elsewhere)
const ragChainFromDocs = RunnableSequence.from([
  {
    question: (input) => input.question,
    chatHistory: (input) => input.chatHistory ?? "",
    context: async (input) => {
      const relevantDocs = await retriever.invoke(input.question);
      const serialized = formatDocumentsAsString(relevantDocs);
      return serialized;
    },
  },
  questionPrompt,
  new ChatOpenAI({}),
  new StringOutputParser(),
]);

// Route only the question string to the retriever, and carry chatHistory
// through the map so it reaches the prompt.
let ragChainWithSource = new RunnableMap({
  steps: {
    context: (input) => retriever.invoke(input.question),
    question: (input) => input.question,
    chatHistory: (input) => input.chatHistory ?? "",
  },
});
ragChainWithSource = ragChainWithSource.assign({
  answer: ragChainFromDocs,
});

// Example chat history
const chatHistory = [
  { role: "human", content: "Hi" },
  { role: "ai", content: "Hello! How can I help you today?" },
];

// Invoke the chain with question and chat history
const result = await ragChainWithSource.invoke({
  question: "What is the capital of France?",
  chatHistory: chatHistory
    .map((msg) => `${msg.role.toUpperCase()}: ${msg.content}`)
    .join("\n"),
});

console.log(result);
```

Ensure that `chatHistory` is properly formatted and passed as a string. This should help resolve the `TypeError: str.normalize is not a function` error. If the issue persists, double-check the dependencies and ensure all required packages are correctly installed and up to date.
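
For the streaming half of the original question: `ragChainWithSource` is itself a runnable, so it can be streamed with `.stream()`. When a map/assign chain streams, chunks arrive as partial objects (the exact shape can vary by version), so a sketch like the following, assuming the chain defined above, separates the sources from the answer tokens:

```
// Sketch: chunks are partial objects such as { context: [...] } or
// { answer: "token" }, so pick off each key as it arrives.
const stream = await ragChainWithSource.stream({
  question: "What is the capital of France?",
  chatHistory: "",
});

for await (const chunk of stream) {
  if (chunk.context !== undefined) {
    console.log("sources:", chunk.context); // the retrieved documents
  }
  if (chunk.answer !== undefined) {
    process.stdout.write(chunk.answer); // answer tokens as they stream in
  }
}
```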

To continue talking to Dosu, mention @dosu.