langchain-ai / langchainjs


TypeError: getDefaultAgent is not a function #5816

Open · guidev opened this issue 1 week ago

guidev commented 1 week ago

Checked other resources

Example Code

  1. git clone --branch node20 https://github.com/guidev/nonworking

  2. npm install

  3. npm run start

Error Message and Stack Trace (if applicable)

TypeError: getDefaultAgent is not a function
    at OpenAI.buildRequest (file:///Users/name/Progetti/nonworking/node_modules/openai/core.mjs:208:66)
    at OpenAI.makeRequest (file:///Users/name/Progetti/nonworking/node_modules/openai/core.mjs:279:44)
    at async file:///Users/name/Progetti/nonworking/node_modules/@langchain/openai/dist/chat_models.js:796:29
    at async RetryOperation._fn (/Users/name/Progetti/nonworking/node_modules/p-retry/index.js:50:12)

Node.js v22.3.0

Description

When using openai-node together with langchain and datadog (dd-trace) in an ESM project, I get the error shown above.

System Info

    "@langchain/openai": "^0.1.3",
    "dd-trace": "^5.17.0",
    "dotenv": "^16.4.5",
    "express": "^4.19.2",
    "langchain": "0.2.5"

guidev commented 1 week ago

Also opened an issue at https://github.com/openai/openai-node/issues/903

dosubot[bot] commented 1 week ago

To resolve the "TypeError: getDefaultAgent is not a function" error when using openai-node with langchain and datadog, ensure you are using the correct function names and imports from the langchain library. There is no getDefaultAgent function exported from langchain/src/agents/toolkits/index.ts.

Instead, use one of the available agent creation functions such as createOpenAIToolsAgent. Here is an example:

import { createOpenAIToolsAgent } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { pull } from "langchain/hub";

const agent = await createOpenAIToolsAgent({
  llm: new ChatOpenAI({
    temperature: 0,
    modelName: "gpt-3.5-turbo-1106",
  }),
  tools: [
    // your tools here
  ],
  prompt: await pull<ChatPromptTemplate>("hwchase17/openai-tools-agent"),
  streamRunnable: true,
});
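
Once the tools array is filled in, the agent is typically run through an AgentExecutor. A minimal sketch, re-using the agent created above and the same tools list:

import { AgentExecutor } from "langchain/agents";

const executor = new AgentExecutor({
  agent,
  tools: [], // the same tools passed to createOpenAIToolsAgent
});

const result = await executor.invoke({ input: "Tell me a joke" });
console.log(result.output);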

Make sure to replace getDefaultAgent with the correct function name from the available exports [1][2][3].

To continue talking to Dosu, mention @dosu.

guidev commented 1 week ago

@dosu

Here's the full code:

import './config/datadog.js';
import express from 'express';
import {ChatOpenAI} from "@langchain/openai";
import {PromptTemplate} from "@langchain/core/prompts";
import {JsonOutputParser} from "@langchain/core/output_parsers";
import dotenv from "dotenv";

dotenv.config();

const app = express();

const prompt = PromptTemplate.fromTemplate(`Return a JSON object with a 'text' field containing a joke`);

const llm = new ChatOpenAI({modelName: "gpt-4o", verbose: true, apiKey: "doesntmatter"})
    .bind({
        response_format: {
            type: "json_object",
        },
    });

const parser = new JsonOutputParser();

const chain = prompt.pipe(llm).pipe(parser);

console.log(await chain.invoke({}));

app.listen(process.env.PORT || 3000);
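
./config/datadog.js isn't included above; a typical dd-trace bootstrap for a setup like this looks roughly as follows (a sketch, not the actual file from the repo; only tracer.init() matters, and the commented-out line is a diagnostic idea using dd-trace's per-plugin config to check whether its openai instrumentation is what breaks the request):

// config/datadog.js -- hypothetical sketch, the real file isn't shown in this issue
import tracer from "dd-trace";

// dd-trace has to be initialized before the modules it instruments are imported,
// which is why this file is the very first import of the app.
tracer.init({
  // service/env settings as needed
});

// Diagnostic: disable only the openai integration to see whether its patching
// of openai-node triggers the getDefaultAgent error.
// tracer.use("openai", { enabled: false });

export default tracer;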

dosubot[bot] commented 1 week ago

To resolve the "TypeError: getDefaultAgent is not a function" error, you should use the createOpenAIFunctionsAgent method instead of the deprecated OpenAIAgent class. Here is an updated version of your code using the createOpenAIFunctionsAgent method:

import './config/datadog.js';
import express from 'express';
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import dotenv from "dotenv";
import { createOpenAIFunctionsAgent, AgentExecutor } from "langchain/agents";
import { pull } from "langchain/hub";

dotenv.config();

const app = express();

const prompt = await pull<ChatPromptTemplate>("hwchase17/openai-functions-agent");

const llm = new ChatOpenAI({
  modelName: "gpt-4o",
  verbose: true,
  apiKey: process.env.OPENAI_API_KEY,
});

const agent = await createOpenAIFunctionsAgent({
  llm,
  tools: [], // Define your tools here
  prompt,
});

const agentExecutor = new AgentExecutor({
  agent,
  tools: [], // Define your tools here
});

const result = await agentExecutor.invoke({
  input: "Return a JSON object with a 'text' field containing a joke",
});

console.log(result);

app.listen(process.env.PORT || 3000);

Make sure to replace "hwchase17/openai-functions-agent" with the correct prompt template if needed and define any tools you want the agent to use. This should resolve the error and properly configure the agent [1].
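
For the tools arrays, a simple custom tool can be defined with DynamicTool from @langchain/core/tools. A sketch (the joke tool and its behaviour are made up for illustration):

import { DynamicTool } from "@langchain/core/tools";

const jokeTool = new DynamicTool({
  name: "joke_generator",
  description: "Returns a short joke as plain text.",
  func: async () => "Why did the developer go broke? Because he used up all his cache.",
});

// Pass the same list to both createOpenAIFunctionsAgent and AgentExecutor.
const tools = [jokeTool];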

To continue talking to Dosu, mention @dosu.