langchain-ai / langchainjs

šŸ¦œšŸ”— Build context-aware reasoning applications šŸ¦œšŸ”—
https://js.langchain.com/docs/
MIT License
12.15k stars · 2.04k forks

Type 'ChatOpenAI<ChatOpenAICallOptions>' is not assignable to type 'RunnableLike<any, any>' #5565

Closed · bianbianzhu closed this issue 3 months ago

bianbianzhu commented 3 months ago

Example Code

The following code:

import { ChatOpenAI } from "@langchain/openai";
import { RunnableSequence } from "@langchain/core/runnables";

const model = new ChatOpenAI({
  model: "gpt-4o",
});

const chain = RunnableSequence.from([
  model,
]);

Error Message and Stack Trace (if applicable)

Type 'ChatOpenAI' is not assignable to type 'RunnableLike<any, any>'. Type 'ChatOpenAI' is not assignable to type 'RunnableMapLike<any, any>'. Index signature for type 'string' is missing in type 'ChatOpenAI'.ts(2322)

Description

Simply use any of the examples from the official LangChain.js website. It seems that the ChatOpenAI<ChatOpenAICallOptions> class from @langchain/openai is not assignable to RunnableLike.

System Info

"langchain": "^0.2.2"
"typescript": "^5.4.5"
Node: v20.12.1
yarn: 1.22.22

tsconfig:

{
  "compilerOptions": {
    /* Language and Environment */
    "target": "ES2022",
    "moduleDetection": "force",

    /* Modules */
    "module": "commonjs",
    "rootDir": "src",
    "resolveJsonModule": true,

    /* JavaScript Support */
    "allowJs": true,

    /* Emit */
    "sourceMap": true,
    "outDir": "dist",

    /* Interop Constraints */
    "isolatedModules": true,
    "esModuleInterop": true,
    "forceConsistentCasingInFileNames": true,
    /* Type Checking */
    "strict": true,
    "noImplicitAny": true,
    "useUnknownInCatchVariables": true,
    // "noUnusedLocals": true,
    // "noUnusedParameters": true,
    "noUncheckedIndexedAccess": true,
    "skipLibCheck": true
  }
}
dosubot[bot] commented 3 months ago

Hey @bianbianzhu! šŸ‘‹ I'm here to help you with bugs, answer questions, and even guide you on contributing. Let's tackle this together while we wait for a human maintainer.

The error you're encountering is due to the ChatOpenAI class not being directly assignable to RunnableLike. To resolve this, you can use the createStructuredOutputRunnable method to wrap the ChatOpenAI instance, ensuring it conforms to the expected type.

Here's how you can modify your code:

import { ChatOpenAI } from '@langchain/openai';
import { createStructuredOutputRunnable } from 'langchain/chains/openai_functions';
import { RunnableSequence } from '@langchain/core/runnables';
import { ChatPromptTemplate } from '@langchain/core/prompts';
import { JsonOutputFunctionsParser } from 'langchain/output_parsers';
import { z } from 'zod';

// Define the expected output schema using Zod
const outputSchema = z.object({
  key1: z.string(),
  key2: z.number(),
});

// Create an instance of ChatOpenAI
const model = new ChatOpenAI({
  model: "gpt-4",
});

// Create a prompt template
const prompt = ChatPromptTemplate.fromMessages([
  ["human", "Human description: {description}"],
]);

// Create an output parser
const outputParser = new JsonOutputFunctionsParser();

// Use the createStructuredOutputRunnable method to define the expected output schema
const runnable = createStructuredOutputRunnable({
  outputSchema,
  llm: model,
  prompt,
  outputParser
});

// Now you can use the runnable within a RunnableSequence
const chain = RunnableSequence.from([runnable]);

// Invoke the sequence with your input
const result = await chain.invoke({ description: "Your input message" });

// Log the result
console.log(result);

This approach ensures that the ChatOpenAI instance is correctly wrapped and assignable to RunnableLike [1][2].


logemann commented 3 months ago

Just wanted to report the same bug. The AI answer mentions a deprecated API, so I won't dig deeper into that solution.

But for more context on this problem space: the alternative to RunnableSequence would be to use pipe, which has type problems as well. So this doesn't work either:

const chain = prompt.pipe(model).pipe(outputParser);

But back to RunnableSequence. When leaving out the outputParser (HttpResponseOutputParser()), this is fine for the compiler:

    const chainX = RunnableSequence.from([prompt, new ChatOpenAI({
      model: "gpt-4",
    })]);

And now the really weird part — this isn't:

    const model2 = new ChatOpenAI({
      model: "gpt-4",
    })
    const chainX = RunnableSequence.from([prompt, model2]);

At this point I thought my TS compiler must be broken, because I've never seen a difference between inlining and not inlining an expression when it comes to compiling / linting.
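(Editor's note: the assignability failure itself is consistent with two copies of @langchain/core being resolved in the tree. TypeScript compares classes with private or protected members nominally, not structurally, so two structurally identical Runnable classes coming from two installed copies of the same package are not mutually assignable. A minimal sketch with toy stand-in classes — these are not the real LangChain types:)

```typescript
// Stand-ins for the same class resolved from two installed copies of a
// package. The bodies are identical, but each declares its own private member.
class RunnableFromCopyA {
  private brand = "core";
  invoke(input: string): string {
    return this.brand === "core" ? input.toUpperCase() : input;
  }
}

class RunnableFromCopyB {
  private brand = "core";
  invoke(input: string): string {
    return this.brand === "core" ? input.toUpperCase() : input;
  }
}

// @ts-expect-error -- classes with private members are compared nominally:
// "Types have separate declarations of a private property 'brand'."
const r: RunnableFromCopyA = new RunnableFromCopyB();

// At runtime the assignment is harmless; only the compile-time check fails.
console.log(r.invoke("ok"));
```

This also explains why inlining can appear to "fix" it: the two code paths may cause the compiler to resolve the type from different copies of the package.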

bianbianzhu commented 3 months ago

I've managed to fix this issue by completely removing all the dependencies and reinstalling them. Caching issue maybe? šŸ˜–šŸ˜–šŸ˜–

leitdeux commented 3 months ago

Like @bianbianzhu, I also had to manually remove (npm rm ...) each LangChain dependency and only then re-install them. Simply deleting node_modules and running npm i had no effect. Once I removed and re-installed each package, the TS compiler errors (such as ChatPromptTemplate is not assignable to Runnable) disappeared.
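(Editor's note: the remove/re-install cycle described above looks roughly like this — the package names are examples, use the ones actually listed in your own package.json:)

```shell
# Remove every LangChain package explicitly (example set -- match your package.json)
npm rm langchain @langchain/core @langchain/openai

# Re-install them together so the resolver can pick a single @langchain/core
npm install langchain @langchain/core @langchain/openai

# Verify that only one copy of @langchain/core ended up in the tree
npm ls @langchain/core
```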

logemann commented 3 months ago

Hmm. Will test this, but it doesn't explain why even the Next.js starter template with LangChain (https://vercel.com/templates/next.js/langchain-starter) instantly breaks on first deploy on Vercel. You can't have a more green-field application than that when it comes to node_modules.

Update: as expected, removing and re-adding deps doesn't change anything in my setup.

Update 2: After going full circle — deleting node_modules, yarn.lock, the yarn cache and whatnot — the compiler doesn't complain anymore. I still don't get why Vercel can't build the mentioned starter template, though; that of course has nothing to do with any local dependency state. I also upgraded the @langchain modules to their latest versions (compared to the ones in the starter template).

bracesproul commented 3 months ago

hey @bianbianzhu this doc should have the answer you're looking for. This issue should be fixed once you set overrides (npm) / resolutions (yarn) — whichever field your package manager uses — to pin @langchain/core to a single version.

I also suggest installing @langchain/core as a hard dependency, along with your other LangChain packages.
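(Editor's note: a sketch of what that looks like in package.json — the version numbers here are illustrative, not prescriptive. yarn 1 reads the resolutions field; npm reads overrides. You only need the field for your own package manager:)

```json
{
  "dependencies": {
    "langchain": "^0.2.2",
    "@langchain/core": "^0.2.5",
    "@langchain/openai": "^0.1.1"
  },
  "resolutions": {
    "@langchain/core": "^0.2.5"
  },
  "overrides": {
    "@langchain/core": "^0.2.5"
  }
}
```

After editing, delete the lockfile and node_modules and re-install, then confirm with npm ls @langchain/core (or yarn why @langchain/core) that only a single copy is resolved.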

IgorHalfeld commented 2 months ago

I've managed to fix this issue by completely removing all the dependencies and reinstalling them. Caching issue maybe? šŸ˜–šŸ˜–šŸ˜–

@bianbianzhu it worked here