langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

Error TS2415: Class 'ChatOpenAI<CallOptions>' incorrectly extends base class 'BaseChatModel<CallOptions, AIMessageChunk>' #5675

Closed · johnnyoshika closed this 1 month ago

johnnyoshika commented 1 month ago

Example Code

Compiling a project that uses ChatOpenAI from @langchain/openai during a Docker build results in an error. To reproduce:

package.json:

  "scripts": {
    "build": "tsc"
  }

Dockerfile:

FROM node:18

WORKDIR /usr/src/app

ENV PORT=8080

# Copy the package manifests first so npm install has a package.json to read
COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build

CMD ["npm", "start"]

Then build with:

docker build --no-cache -t container-name .

Error Message and Stack Trace (if applicable)

node_modules/@langchain/openai/dist/chat_models.d.ts(68,22): error TS2415: Class 'ChatOpenAI<CallOptions>' incorrectly extends base class 'BaseChatModel<CallOptions, AIMessageChunk>'.
'getLsParams' is protected in type 'ChatOpenAI<CallOptions>' but public in type 'BaseChatModel<CallOptions, AIMessageChunk>'.
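For context, TS2415 enforces TypeScript's visibility rules for inheritance: a subclass may widen a member's visibility (protected to public) but may never narrow it. A minimal, hypothetical sketch of the rule (`Base` and `Sub` are illustrative names, not LangChain classes):

```typescript
class Base {
  // In @langchain/core 0.2.x this member was made public on BaseChatModel.
  public getParams(): string {
    return "base";
  }
}

// A subclass must keep the member at least as visible as the base declares.
// Declaring it `protected` here instead would narrow visibility and trigger:
//   TS2415: Class 'Sub' incorrectly extends base class 'Base'.
//   'getParams' is protected in type 'Sub' but public in type 'Base'.
class Sub extends Base {
  public getParams(): string {
    return "sub";
  }
}

const model: Base = new Sub(); // assignable because visibility matches
console.log(model.getParams()); // prints "sub"
```

This is why the error surfaces only when the installed `@langchain/openai` and `@langchain/core` declaration files disagree about the member's visibility.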

Description

I'm trying to compile a project inside a Docker build, and it fails with the error listed above.

I have to enable "skipLibCheck": true in my tsconfig.json in order to get around this problem.
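The workaround mentioned above is a one-line change in tsconfig.json; skipLibCheck stops tsc from type-checking the .d.ts files under node_modules, so it suppresses the error rather than fixing its cause:

```json
{
  "compilerOptions": {
    "skipLibCheck": true
  }
}
```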

Note that running the same command (npm run build, which runs tsc) directly on macOS or Windows does not produce this error. It seems to occur only during docker build.

System Info

Node: 18
langchain@0.2.4
langsmith@0.1.30
@langchain/openai@0.1.1

Running docker build --no-cache -t container-name . command in macOS Terminal.

dosubot[bot] commented 1 month ago

To resolve the TypeScript compilation error related to ChatOpenAI from @langchain/openai during a Docker build, you can try the following steps:

  1. Ensure all dependencies are installed correctly: Sometimes, Docker might not install all dependencies correctly. Make sure to run npm install inside your Docker container.

  2. Check TypeScript configuration: Ensure that your tsconfig.json is correctly configured and includes all necessary paths and settings. You can refer to the tsconfig.cjs.json used in @langchain/openai for guidance:

    {
      "extends": "./tsconfig.json",
      "compilerOptions": {
        "module": "commonjs",
        "declaration": false
      },
      "exclude": [
        "node_modules",
        "dist",
        "docs",
        "**/tests"
      ]
    }
  3. Install missing types: Sometimes, TypeScript might be missing type definitions. Ensure that you have installed all necessary type definitions, especially for @langchain/openai.

  4. Use a specific version of dependencies: There might be differences in how dependencies are resolved in Docker compared to macOS or Windows. Pin the versions of your dependencies in package.json. For example, ensure you are using compatible versions as specified in @langchain/openai:

    "dependencies": {
      "@langchain/core": ">=0.2.5 <0.3.0",
      "js-tiktoken": "^1.0.12",
      "openai": "^4.41.1",
      "zod": "^3.22.4",
      "zod-to-json-schema": "^3.22.3"
    },
    "devDependencies": {
      "@azure/identity": "^4.2.0",
      "@jest/globals": "^29.5.0",
      "@langchain/scripts": "~0.0.14",
      "@langchain/standard-tests": "workspace:*",
      "@swc/core": "^1.3.90",
      "@swc/jest": "^0.2.29",
      "dpdm": "^3.12.0",
      "eslint": "^8.33.0",
      "eslint-config-airbnb-base": "^15.0.0",
      "eslint-config-prettier": "^8.6.0",
      "eslint-plugin-import": "^2.27.5",
      "eslint-plugin-jest": "^27.6.0",
      "eslint-plugin-no-instanceof": "^1.0.1",
      "eslint-plugin-prettier": "^4.2.1",
      "jest": "^29.5.0",
      "jest-environment-node": "^29.6.4",
      "prettier": "^2.8.3",
      "release-it": "^15.10.1",
      "rimraf": "^5.0.1",
      "typescript": "~5.1.6"
    }
  5. Check for environment variables: The ChatOpenAI class requires certain environment variables to be set, such as OPENAI_API_KEY for OpenAI or AZURE_OPENAI_API_KEY, AZURE_OPENAI_API_INSTANCE_NAME, AZURE_OPENAI_API_DEPLOYMENT_NAME, and AZURE_OPENAI_API_VERSION for Azure. Ensure these are correctly set in your Docker environment.
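Item 5 can be made concrete with a small pre-flight check. This is only a sketch (the variable list and wording are illustrative): ChatOpenAI reads OPENAI_API_KEY from the environment when no apiKey option is passed, and failing fast with a clear message is easier to debug inside a container than a constructor error.

```typescript
// Hypothetical pre-flight check for required environment variables.
const required = ["OPENAI_API_KEY"];
const missing = required.filter((name) => !process.env[name]);

if (missing.length > 0) {
  console.log(`Missing environment variables: ${missing.join(", ")}`);
} else {
  console.log("All required environment variables are set");
}
```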

Here is an example of how you might configure your Dockerfile to ensure all dependencies are installed correctly:

FROM node:18

# Create app directory
WORKDIR /usr/src/app

# Install app dependencies
COPY package*.json ./

RUN npm install

# Bundle app source
COPY . .

# Build the TypeScript code
RUN npm run build

# If you have a start script
CMD [ "npm", "start" ]

And ensure your tsconfig.json includes necessary configurations:

{
  "compilerOptions": {
    "target": "ES6",
    "module": "commonjs",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "outDir": "./dist"
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "**/*.spec.ts"]
}

Additionally, ensure that your environment variables are correctly set in your Docker environment. You can do this by adding the following to your Dockerfile:

# Set environment variables (for real secrets, prefer passing values at
# runtime with docker run -e or --env-file instead of baking them into the image)
ENV OPENAI_API_KEY=<your_openai_api_key>
ENV AZURE_OPENAI_API_KEY=<your_azure_openai_api_key>
ENV AZURE_OPENAI_API_INSTANCE_NAME=<your_azure_instance_name>
ENV AZURE_OPENAI_API_DEPLOYMENT_NAME=<your_azure_deployment_name>
ENV AZURE_OPENAI_API_VERSION=<your_azure_api_version>

If the issue persists, you might need to provide more specific error messages or logs to diagnose further. Additionally, ensure that the ChatOpenAI class and its dependencies are correctly imported and used as per the documentation:

import { ChatOpenAI } from "@langchain/openai";

// Example usage
const model = new ChatOpenAI({
  temperature: 0.9,
  model: "ft:gpt-3.5-turbo-0613:{ORG_NAME}::{MODEL_ID}",
});

const message = await model.invoke("Hi there!");
console.log(message);

Make sure that all required environment variables and configurations are correctly set up as described in the ChatOpenAI class documentation [1][2][3][4][5].

To continue talking to Dosu, mention @dosu.

johnnyoshika commented 1 month ago

I created this repo to reproduce the bug in 4 easy steps. It turns out the problem occurs even outside of the Docker build step: https://github.com/johnnyoshika/langchain-ts2415

Demo:

https://github.com/langchain-ai/langchainjs/assets/504505/5ad6ad7b-ecdd-40c7-8f28-0adff9f44d56

viswak-cn commented 1 month ago

I can confirm that I am facing a related issue when I am trying to do something like the following:

import { createToolCallingAgent } from 'langchain/agents'
import { ChatOpenAI } from '@langchain/openai'

// ...

this.client = new ChatOpenAI({
  model: 'gpt-4o',
  temperature: 0,
})

// ...

const agent = createToolCallingAgent({
  llm: this.client, // <---- The point where the error occurs (even my TS linter flags it)
  tools,
  prompt,
})

I get the following error:

error TS2322: Type 'ChatOpenAI<ChatOpenAICallOptions>' is not assignable to type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>'.
  Property 'getLsParams' is protected in type 'ChatOpenAI<ChatOpenAICallOptions>' but public in type 'BaseChatModel<BaseLanguageModelCallOptions, BaseMessageChunk>'

bracesproul commented 1 month ago

Sorry you were getting this error! If you upgrade to the latest version of @langchain/openai (0.1.2), it should be resolved. The issue was that I updated all instances of getLsParams to be public instead of protected, but only cut a release for @langchain/core, so the released code for openai and the other partner packages was incompatible with the released code in @langchain/core.

We're taking steps to add tests for this, so this type of issue can be caught in the future.

johnnyoshika commented 1 month ago

@bracesproul thanks for the quick fix!

AumeshMisra commented 1 month ago

Hi @bracesproul, I am seeing a similar issue with: https://github.com/langchain-ai/langchainjs/issues/5736 even with the latest packages.