langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Typescript Error: RunnableSequence cannot take ChatPromptTemplate as parameter #4511

Closed nnduc1994 closed 2 months ago

nnduc1994 commented 6 months ago

As the title says, I am trying to follow the RunnableSequence example from here: https://js.langchain.com/docs/expression_language/interface#invoke

However, I got the following TypeScript error:

```
Argument of type '[ChatPromptTemplate<ParamsFromFString<"Use the rules about Uno, to answer user question: about Uno">, any>]' is not assignable to parameter of type '[RunnableLike<any, any>, ...RunnableLike<any, any>[], RunnableLike<any, any>]
```
dosubot[bot] commented 6 months ago

🤖

Hey there, @nnduc1994! Long time no see, hope you're doing awesome. Caught yourself in a TypeScript tangle, huh? Let's see what we can do about that.

Based on the error message you provided, it seems like you're trying to pass an array of ChatPromptTemplate instances to the RunnableSequence constructor. However, RunnableSequence expects an array of RunnableLike instances.

ChatPromptTemplate and RunnableLike are different types. ChatPromptTemplate is a class for creating chat prompt templates, while RunnableLike is a type that represents objects that can be run.

If you're trying to create a sequence of chat prompts, you might want to create a ChatPromptTemplate instance for each prompt, and then pass these instances to the RunnableSequence constructor. However, you'll need to ensure that these ChatPromptTemplate instances are compatible with RunnableLike.

Here's an example of how you might do this:

```typescript
import { RunnableSequence, ChatPromptTemplate } from 'langchainjs';

const chatPrompt1 = new ChatPromptTemplate(/* parameters */);
const chatPrompt2 = new ChatPromptTemplate(/* parameters */);

const runnableSequence = new RunnableSequence(chatPrompt1, chatPrompt2);
```

In this example, chatPrompt1 and chatPrompt2 are instances of ChatPromptTemplate, and they're passed as arguments to the RunnableSequence constructor.

Please note that this is just a general example. The exact code might vary depending on the specific parameters and methods of your ChatPromptTemplate and RunnableSequence classes.

If you're still encountering issues, could you please provide more details about your ChatPromptTemplate and RunnableSequence classes, and how you're trying to use them? This would help me provide a more accurate solution.

For more information about ChatPromptTemplate and RunnableLike, you can refer to the langchain-core/src/prompts/chat.ts and langchain-core/src/runnables/runnable.ts files in the langchainjs repository.

I hope this helps! If you have any other questions, feel free to ask.

Sources

#### About Dosu This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
nnduc1994 commented 6 months ago

@dosu-bot

First, the import in your example is incorrect. There is no package called 'langchainjs'; the package is called 'langchain'.

I am following this example code:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { RunnableSequence } from "@langchain/core/runnables";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

// You can also create a chain using an array of runnables
const chain = RunnableSequence.from([promptTemplate, model]);

const result = await chain.invoke({ topic: "bears" });

console.log(result);
```

Are you saying that the example in the LangChain docs is out of date?

dosubot[bot] commented 6 months ago

🤖

Based on the error message you provided, it seems like TypeScript is expecting an array of RunnableLike instances, but you're providing an array of ChatPromptTemplate instances. The RunnableLike type is a union of several types, including RunnableInterface, RunnableFunc, and RunnableMapLike.

In your case, you're trying to create a RunnableSequence from an array of ChatPromptTemplate instances. However, ChatPromptTemplate is not a RunnableLike type, hence the TypeScript error.

The RunnableSequence constructor expects an array of RunnableLike instances. Here's the type definition for RunnableLike:

```typescript
export type RunnableLike<RunInput = any, RunOutput = any> =
  | RunnableInterface<RunInput, RunOutput>
  | RunnableFunc<RunInput, RunOutput>
  | RunnableMapLike<RunInput, RunOutput>;
```

This means that each element in the array should be either a RunnableInterface, RunnableFunc, or RunnableMapLike.
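The union above can be illustrated with a dependency-free sketch (these few lines are not the actual `@langchain/core` implementation, just an analogy): a bare function counts as a `RunnableFunc`, and a sequence composes such steps left to right.

```typescript
// Minimal analogy for RunnableFunc: any plain (input) => output function.
type RunnableFunc<I, O> = (input: I) => O;

// Compose two steps the way a sequence chains its elements:
// the output type of step one must match the input type of step two.
function sequence<A, B, C>(
  first: RunnableFunc<A, B>,
  second: RunnableFunc<B, C>
): RunnableFunc<A, C> {
  return (input) => second(first(input));
}

const chain = sequence(
  (input: { topic: string }) => `topic=${input.topic}`, // plays the prompt's role
  (s) => s.toUpperCase()                                // plays the parser's role
);

console.log(chain({ topic: "bears" })); // "TOPIC=BEARS"
```

The type error in this issue arises when TypeScript decides one of the array elements does not fit this step-to-step chaining contract.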

In your code, you're trying to pass a ChatPromptTemplate to the RunnableSequence constructor, which is not a RunnableLike. You should instead pass a RunnableLike to the RunnableSequence constructor.

Here's how you can modify your code:

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate, ChatPromptTemplate } from "@langchain/core/prompts";
import { RunnableSequence } from "@langchain/core/runnables";

const model = new ChatOpenAI({});
const promptTemplate = PromptTemplate.fromTemplate(
  "Tell me a joke about {topic}"
);

// Create a ChatPromptTemplate from the PromptTemplate
const chatPromptTemplate = new ChatPromptTemplate(promptTemplate, model);

// You can also create a chain using an array of runnables
const chain = RunnableSequence.from([chatPromptTemplate]);

const result = await chain.invoke({ topic: "bears" });

console.log(result);
```

In this modified code, a ChatPromptTemplate is created from the PromptTemplate and the ChatOpenAI model. This ChatPromptTemplate is a RunnableLike, so it can be passed to the RunnableSequence constructor.

I hope this helps! If you have any other questions, feel free to ask.

For more information, you can refer to the LangChainJS documentation and the LangChainJS codebase.

-Dosu

jacoblee93 commented 6 months ago

Can you share your package.json file and your tsconfig.json file?

nnduc1994 commented 6 months ago

Hi, @jacoblee93

Here are tsconfig.json and package.json

tsconfig.json

```json
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "NodeNext",
    "strictNullChecks": true,
    "esModuleInterop": true,
    "moduleResolution": "Bundler"
  },
  "include": ["src/**/*.ts"]
}
```

package.json

```json
{
  "engines": {
    "node": ">=18"
  },
  "dependencies": {
    "@pinecone-database/pinecone": "^1.1.1",
    "colors": "^1.4.0",
    "dotenv": "^16.3.1",
    "langchain": "^0.0.167",
    "pdf-parse": "^1.1.1",
    "readline-sync": "^1.4.10",
    "@langchain/core": "0.1.5"
  },
  "devDependencies": {
    "@types/prompt-sync": "^4.2.1",
    "@types/readline-sync": "^1.4.5",
    "typescript": "^5.2.2"
  },
  "overrides": {
    "@langchain/core": "0.1.5"
  }
}
```
nnduc1994 commented 6 months ago

Here is the code I get the error with (almost identical to an example in the docs):

```typescript
import "dotenv/config";

import * as path from "path";
import { PromptTemplate } from "@langchain/core/prompts";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { PDFLoader } from "langchain/document_loaders/fs/pdf";
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { StringOutputParser } from "langchain/schema/output_parser";
import { RunnableSequence } from "@langchain/core/runnables";

const chatModel = new ChatOpenAI({
  temperature: 0.9,
  // openAIApiKey: OPENAI_API_KEY
});

const loader = new PDFLoader(path.join(__dirname, "./text/uno_rules.pdf"));
const splitter = new RecursiveCharacterTextSplitter({
  chunkSize: 128,
  chunkOverlap: 0,
});
const embeddings = new OpenAIEmbeddings();
const output = new StringOutputParser();
const vectorstore = new MemoryVectorStore(embeddings);

const chatbot = async () => {
  //console.log(colors.bold.green("Chatbot started, type exit to quit"));

  const UnoRulesDocs = await loader.load();
  const splitDocs = await splitter.splitDocuments(UnoRulesDocs);

  await vectorstore.addDocuments(splitDocs);
  const retriever = vectorstore.asRetriever();

  const promptTemplate = PromptTemplate.fromTemplate(
    "Use the rules about Uno, to answer user question: about Uno. Question: {question}"
  );

  // TS ERROR HERE
  const unoChain = RunnableSequence.from([promptTemplate, chatModel, output]);

  const response = await unoChain.invoke({ question: "can I play +2 after +2" });

  console.log(response);
};

chatbot();
```
w-biggs commented 5 months ago

Seeing the exact same issue. The full type error is the following:

```
Argument of type '[PromptTemplate<ParamsFromFString<"Prompt goes here.\n{format_instructions}">, any>, ChatOpenAI<...>, StructuredOutputParser<...>]' is not assignable to parameter of type '[RunnableLike<ParamsFromFString<"You are a image analysis assistant capable of deducing what species of organism is shown in a photograph.\n{format_instructions}">, any>, ...RunnableLike<...>[], RunnableLike<...>]'.
  Type at position 2 in source is not compatible with type at position 2 in target.
    Type 'StructuredOutputParser<ZodObject<{ taxon: ZodString; description: ZodString; }, "strip", ZodTypeAny, { taxon: string; description: string; }, { taxon: string; description: string; }>>' is not assignable to type 'RunnableLike<any, { taxon: string; description: string; }>'.
      Type 'StructuredOutputParser<ZodObject<{ taxon: ZodString; description: ZodString; }, "strip", ZodTypeAny, { taxon: string; description: string; }, { taxon: string; description: string; }>>' is not assignable to type 'RunnableInterface<any, { taxon: string; description: string; }, RunnableConfig>'.
        Types of property 'invoke' are incompatible.
          Type '(input: string | BaseMessage, options?: RunnableConfig | undefined) => Promise<{ taxon: string; description: string; }>' is not assignable to type '(input: any, options?: Partial<RunnableConfig> | undefined) => Promise<{ taxon: string; description: string; }>'.
            Types of parameters 'options' and 'options' are incompatible.
              Type 'Partial<RunnableConfig> | undefined' is not assignable to type 'RunnableConfig | undefined'.
                Type 'Partial<RunnableConfig>' is not assignable to type 'RunnableConfig'.
                  Types of property 'callbacks' are incompatible.
                    Type 'import("/Users/<user>/Documents/dev/vision-search/node_modules/@langchain/core/dist/callbacks/manager", { with: { "resolution-mode": "import" } }).Callbacks | undefined' is not assignable to type 'import("/Users/<user>/Documents/dev/vision-search/node_modules/langchain/node_modules/@langchain/core/dist/callbacks/manager", { with: { "resolution-mode": "import" } }).Callbacks | undefined'.
                      Type 'CallbackManager' is not assignable to type 'Callbacks | undefined'.
                        Property 'getParentRunId' is missing in type 'import("/Users/<user>/Documents/dev/vision-search/node_modules/@langchain/core/dist/callbacks/manager", { with: { "resolution-mode": "import" } }).CallbackManager' but required in type 'import("/Users/<user>/Documents/dev/vision-search/node_modules/langchain/node_modules/@langchain/core/dist/callbacks/manager", { with: { "resolution-mode": "import" } }).CallbackManager'
```
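Note the two different import paths in the last lines: one copy of `@langchain/core` is hoisted to the top-level `node_modules`, another is nested under `langchain/node_modules`. The mechanism can be sketched with hypothetical types (the interface names below are invented for illustration): TypeScript compares types structurally, so when one copy of the library requires a member the other copy lacks, assignability fails in exactly this way.

```typescript
// Stand-in for the hoisted copy's CallbackManager shape.
interface HoistedCallbackManager {
  handleStart(runId: string): void;
}

// Stand-in for the nested copy's shape, which added `getParentRunId`.
interface NestedCallbackManager extends HoistedCallbackManager {
  getParentRunId(): string | undefined;
}

const nested: NestedCallbackManager = {
  handleStart: () => {},
  getParentRunId: () => undefined,
};

// Widening compiles: the richer shape satisfies the narrower one...
const hoisted: HoistedCallbackManager = nested;

// ...but the reverse direction fails to type-check, mirroring the error above:
// const bad: NestedCallbackManager = { handleStart: () => {} };
// ^ Property 'getParentRunId' is missing in type '{ handleStart: () => void; }'

console.log(typeof hoisted.handleStart); // "function"
```

Deduplicating so only one copy of `@langchain/core` is installed makes both sides of the comparison the same type and the error disappears.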
jacoblee93 commented 5 months ago

I believe it's two different versions of core getting pulled in - what package manager are you using?

Can you try the instructions here?

https://js.langchain.com/docs/get_started/installation#installing-integration-packages

nnduc1994 commented 5 months ago

Hi,

Forgot to update: I am not 100% sure it is the root cause (I just threw the whole code above away and copy-pasted the code from the documentation example). However, I think the issue might lie in the @langchain/core and langchain versions.

After updating langchain from ^0.0.167 to ^0.1.25 and @langchain/core from ^0.1.5 to ^0.1.41, the problem seems to be solved.

nnduc1994 commented 5 months ago

@jacoblee93 Maybe this is not the right place, but I would like to propose some improvements to the docs here: https://js.langchain.com/docs/get_started/installation#langchaincore

I had a hard time understanding when to import from @langchain and when to import from @langchain/core. In particular, I didn't understand whether I need to install @langchain/core separately, because the documentation says it is automatically installed along with langchain, but can also be used separately.

I opened a QA question and dosubot did help me to understand it better: https://github.com/langchain-ai/langchainjs/discussions/4647#discussioncomment-8694218

dosubot[bot] commented 5 months ago

I understand your confusion regarding when to import from @langchain and when to import from @langchain/core, and whether @langchain/core needs to be installed separately. Let's clarify these points:

  1. When to import from @langchain vs @langchain/core: You should import from @langchain when you are using the main LangChain package, which includes the core functionalities along with some default integrations. This package is suitable for general use cases where the default set of features and integrations provided by LangChain are sufficient for your project's needs. On the other hand, you should import from @langchain/core when you need to use the base abstractions that the rest of the LangChain ecosystem uses, along with the LangChain Expression Language. This package is necessary when you are building custom integrations or when you want to ensure that all integration packages in your project are using the same version of the core abstractions.

  2. Installation of @langchain/core: Although @langchain/core is automatically installed with langchain, it can also be used separately. This indicates that while @langchain/core is a dependency of the main langchain package and gets installed with it, there might be scenarios where you want to explicitly install or update @langchain/core to a specific version to maintain compatibility across different integration packages you are using. This is further emphasized by the documentation suggesting the use of "resolutions" or "overrides" in your package.json to ensure all integrations use the same version of @langchain/core.

In summary, @langchain/core does not need to be installed separately for general use cases since it is automatically included with the langchain package. However, for advanced use cases, especially when dealing with multiple integration packages, you might want to explicitly manage the version of @langchain/core used in your project to ensure compatibility and consistency across different integrations.
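The "overrides"/"resolutions" mechanism described above can be sketched as a `package.json` fragment (the version numbers are the ones that resolved the error in this thread; use whatever single version fits your project). The "overrides" field applies to npm, "resolutions" to Yarn; you would normally include only the one your package manager uses:

```json
{
  "dependencies": {
    "langchain": "^0.1.25",
    "@langchain/core": "^0.1.41"
  },
  "overrides": {
    "@langchain/core": "^0.1.41"
  },
  "resolutions": {
    "@langchain/core": "^0.1.41"
  }
}
```

With a pin like this in place, every integration package resolves to the same copy of `@langchain/core`, which avoids the duplicate-type errors discussed earlier in this thread.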

As for the specific functionalities provided by the langchain package that are not available in the @langchain/core package, I wasn't able to find an answer within the repository. However, the general rule of thumb is that langchain provides a broader set of features and utilities for working with language models and integrations, while @langchain/core provides the base abstractions used by the rest of the LangChain ecosystem.

I hope this clarifies your question. If you need further assistance, feel free to ask.

For more details, you can refer to the following sources:

Dosu

jacoblee93 commented 5 months ago

Thank you, yes we should add something about that. Will reopen and make a note.