langchain-ai / langchainjs

🦜🔗 Build context-aware reasoning applications 🦜🔗
https://js.langchain.com/docs/
MIT License

AzureOpenAI Missing Required Properties for BaseChatModel #6886

Open Nicolas705 opened 2 days ago

Nicolas705 commented 2 days ago

Checked other resources

Example Code

import { AzureOpenAI } from "@langchain/openai";
import { createToolCallingAgent } from "langchain/agents";

const llm = new AzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});

Error Message and Stack Trace (if applicable)

[
  {
    "resource": "/Users/nicolasgertler/.../agent.ts",
    "owner": "typescript",
    "code": "2739",
    "severity": 8,
    "message": "Type 'AzureOpenAI' is missing the following properties from type 'BaseChatModel<BaseChatModelCallOptions, BaseMessageChunk>': getLsParams, callPrompt",
    "source": "ts",
    "startLineNumber": 741,
    "startColumn": 9,
    "endLineNumber": 741,
    "endColumn": 12,
    "relatedInformation": [
      {
        "startLineNumber": 16,
        "startColumn": 5,
        "endLineNumber": 16,
        "endColumn": 8,
        "message": "The expected type comes from property 'llm' which is declared here on type 'CreateToolCallingAgentParams'",
        "resource": "/Users/nicolasgertler/.../node_modules/langchain/dist/agents/tool_calling/index.d.ts"
      }
    ]
  }
]

Description

When using the AzureOpenAI class from the @langchain/openai package, a TypeScript error occurs indicating that the class is missing required properties (getLsParams and callPrompt) from the BaseChatModel interface. This error prevents the AzureOpenAI class from being used as an llm in the createToolCallingAgent function.

This error is a LangChain issue because the AzureOpenAI class provided by the @langchain/openai package does not fully implement the BaseChatModel interface required by the createToolCallingAgent function. The missing methods (getLsParams and callPrompt) are expected by LangChain's internal type definitions, indicating a gap in the library's implementation.
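The mechanism behind TS2739 is TypeScript's structural typing: a value is only assignable to a parameter type if it has every member that type declares. A minimal, self-contained sketch (hypothetical names, not the actual LangChain classes) of the same shape of failure:

```typescript
// A target interface that requires two members, analogous to BaseChatModel
// requiring getLsParams and callPrompt.
interface ChatModelLike {
  invoke(input: string): string;
  getLsParams(): Record<string, string>;
}

// A class that only has invoke, analogous to AzureOpenAI from the
// error message: structurally it is missing getLsParams.
class CompletionModel {
  invoke(input: string): string {
    return `completion: ${input}`;
  }
}

// A subclass that adds the missing member, so it now satisfies
// ChatModelLike structurally.
class ChatModel extends CompletionModel {
  getLsParams(): Record<string, string> {
    return { ls_model_type: "chat" };
  }
}

function run(llm: ChatModelLike): string {
  return llm.invoke("hi");
}

// run(new CompletionModel()); // TS2739: missing getLsParams
console.log(run(new ChatModel())); // compiles once every member exists
```

The compiler never checks class names, only members, which is why the fix below (adding the missing methods via a subclass) makes the assignment typecheck.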

System Info

@langchain/community: ^0.2.16
@langchain/core: ^0.2.16
@langchain/openai: ^0.2.6
dosubot[bot] commented 2 days ago

I found a similar open issue that might be relevant to your problem: Typing issue with langchain, @langchain/core, and @langchain/openai [1]. This issue discusses type mismatches and provides some potential solutions.

To resolve the TypeScript error, which reports that the AzureOpenAI class is missing the getLsParams and callPrompt properties required by the BaseChatModel interface, one workaround is to subclass AzureOpenAI and implement those methods yourself.

Here is an example of how you can extend the AzureOpenAI class to include the missing methods:

import { AzureOpenAI } from "@langchain/openai";
// These types live in deep entry points of @langchain/core, not its root export.
import type { LangSmithParams } from "@langchain/core/language_models/chat_models";
import type { BaseLanguageModelCallOptions } from "@langchain/core/language_models/base";
import type { BasePromptValueInterface } from "@langchain/core/prompt_values";
import { AIMessage, type BaseMessage } from "@langchain/core/messages";
import type { Callbacks } from "@langchain/core/callbacks/manager";

class CustomAzureOpenAI extends AzureOpenAI {
  getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
    return {
      ls_model_type: "chat",
      ls_stop: options.stop,
    };
  }

  async callPrompt(
    promptValue: BasePromptValueInterface,
    options?: string[] | BaseLanguageModelCallOptions,
    callbacks?: Callbacks
  ): Promise<BaseMessage> {
    // AzureOpenAI is a completion-style model, so call() expects a string
    // prompt and returns a string; wrap the result in an AIMessage to
    // satisfy the BaseMessage return type.
    const text = await this.call(promptValue.toString(), options, callbacks);
    return new AIMessage(text);
  }
}

const llm = new CustomAzureOpenAI({
  model: "gpt-4o",
  azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
  azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_API_INSTANCE_NAME,
  azureOpenAIApiDeploymentName: process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
  temperature: 0,
  // other params...
});

const agent = createToolCallingAgent({
  llm,
  tools,
  prompt,
});

By extending the AzureOpenAI class and adding the getLsParams and callPrompt methods, you ensure that it conforms to the BaseChatModel interface, resolving the TypeScript error [2][3].
