guidev opened this issue 1 week ago (status: Open)
Also opened an issue at https://github.com/openai/openai-node/issues/903
To resolve the "TypeError: getDefaultAgent is not a function" error when using openai-node with langchain and datadog, ensure you are using the correct function names and imports from the langchain library. There is no getDefaultAgent function exported from langchain/src/agents/toolkits/index.ts. Instead, use one of the available agent creation functions, such as createOpenAIToolsAgent. Here is an example:
import { createOpenAIToolsAgent } from "langchain/agents";
import { ChatOpenAI } from "@langchain/openai";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { pull } from "langchain/hub";
const agent = await createOpenAIToolsAgent({
llm: new ChatOpenAI({
temperature: 0,
modelName: "gpt-3.5-turbo-1106",
}),
tools: [
// your tools here
],
prompt: await pull<ChatPromptTemplate>("hwchase17/openai-tools-agent"),
streamRunnable: true,
});
Make sure to replace getDefaultAgent with the correct function name from the available exports [1][2][3].
To continue talking to Dosu, mention @dosu.
@dosu
Here's the full code:
import './config/datadog.js';
import express from 'express';
import {ChatOpenAI} from "@langchain/openai";
import {PromptTemplate} from "@langchain/core/prompts";
import {JsonOutputParser} from "@langchain/core/output_parsers";
import dotenv from "dotenv";
dotenv.config();
const app = express();
const prompt = PromptTemplate.fromTemplate(`Return a JSON object with a 'text' field containing a joke`);
const llm = new ChatOpenAI({modelName: "gpt-4o", verbose: true, apiKey: "doesntmatter"})
.bind({
response_format: {
type: "json_object",
},
});
const parser = new JsonOutputParser();
const chain = prompt.pipe(llm).pipe(parser);
console.log(await chain.invoke({}));
app.listen(process.env.PORT || 3000);
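The imported `./config/datadog.js` file is not shown in the snippet above. As a point of reference only, a minimal dd-trace initialization file typically looks like the sketch below; the service name and options here are illustrative assumptions, not taken from the actual repository:

```javascript
// config/datadog.js — hypothetical sketch; the real file in this repo is not shown.
// dd-trace must be imported and initialized before any instrumented modules
// (express, openai, etc.) are loaded, which is why it is the first import
// in the application entry point.
import tracer from 'dd-trace';

tracer.init({
  // Illustrative options — adjust to your environment:
  service: 'langchain-demo',
  env: process.env.DD_ENV || 'development',
});

export default tracer;
```

With ESM, import ordering alone may not be enough to guarantee dd-trace patches modules before they load; this is exactly the kind of setup where instrumentation-related errors such as the one reported here can surface.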
To resolve the "TypeError: getDefaultAgent is not a function" error, you should use the createOpenAIFunctionsAgent method instead of the deprecated OpenAIAgent class. Here is an updated version of your code using the createOpenAIFunctionsAgent method:
import './config/datadog.js';
import express from 'express';
import { ChatOpenAI } from "@langchain/openai";
import { PromptTemplate } from "@langchain/core/prompts";
import { JsonOutputParser } from "@langchain/core/output_parsers";
import dotenv from "dotenv";
import { createOpenAIFunctionsAgent, AgentExecutor } from "langchain/agents";
import { pull } from "langchain/hub";
dotenv.config();
const app = express();
const prompt = await pull<PromptTemplate>("hwchase17/openai-functions-agent");
const llm = new ChatOpenAI({
modelName: "gpt-4o",
verbose: true,
apiKey: process.env.OPENAI_API_KEY,
});
const agent = await createOpenAIFunctionsAgent({
llm,
tools: [], // Define your tools here
prompt,
});
const agentExecutor = new AgentExecutor({
agent,
tools: [], // Define your tools here
});
const result = await agentExecutor.invoke({
input: "Return a JSON object with a 'text' field containing a joke",
});
console.log(result);
app.listen(process.env.PORT || 3000);
Make sure to replace "hwchase17/openai-functions-agent" with the correct prompt template if needed, and define any tools you want the agent to use. This should resolve the error and properly configure the agent [1].
Checked other resources
Example Code
git clone --branch node20 https://github.com/guidev/nonworking
npm run start
Error Message and Stack Trace (if applicable)
Description
When using openai-node with langchain and datadog in an ESM setup, I get the following error:
System Info