shirshendubhowmick opened this issue 2 months ago
Example Code

I generate the LangChain LLM object conditionally like this:

Error Message and Stack Trace (if applicable)

While trying to use `llm.withStructuredOutput(SomeZodSchema)`, I get this type error.

Description

I think the LangChain docs say it provides a unified interface for `withStructuredOutput` across different LLMs. In that case, shouldn't the types be compatible?

System Info

platform: Mac
Node: 20.10.0
npm: 10.2.3
I found a similar closed issue that might be relevant to your problem: Zod and Enum support. That issue was resolved by updating from version 0.0.55 to 0.0.59 [1].
The type error you are encountering when using `llm.withStructuredOutput(SomeZodSchema)` with different LLM providers in LangChain.js is likely because the LLM you are using does not support the `withStructuredOutput` method, which is designed to work with LLMs that support structured output. Calling code checks whether the LLM object has a `withStructuredOutput` method and throws an error if it does not:

```ts
if (typeof llm.withStructuredOutput !== "function") {
  throw new Error(
    "The specified LLM does not support the 'withStructuredOutput'. Please ensure you are using an LLM that supports this feature."
  );
}
```
Additionally, the `withStructuredOutput` method has provider-specific requirements for structured output, such as which tools can be bound and which response formats are supported. For example, GoogleAI's implementation throws if the method is set to `jsonMode`:

```ts
if (method === "jsonMode") {
  throw new Error(`Google only supports "functionCalling" as a method.`);
}
```

Ensure that the LLM provider you are using supports the `withStructuredOutput` method and meets the necessary requirements for structured output. For instance, GoogleAI only supports the "functionCalling" method, not "jsonMode" [2].
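As an illustrative sketch of that constraint (assuming `@langchain/google-genai`; the `Joke` schema and model name below are hypothetical):

```ts
import { z } from "zod";
import { ChatGoogleGenerativeAI } from "@langchain/google-genai";

// Hypothetical schema, used only for this sketch
const Joke = z.object({
  setup: z.string(),
  punchline: z.string(),
});

// Model name is illustrative
const google = new ChatGoogleGenerativeAI({ model: "gemini-1.5-pro" });

// OK: Google supports function calling for structured output
const ok = google.withStructuredOutput(Joke, { method: "functionCalling" });

// Throws at runtime: Google only supports "functionCalling" as a method
const bad = google.withStructuredOutput(Joke, { method: "jsonMode" });
```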
In LangChain.js, the `BaseLanguageModel` class includes the `withStructuredOutput` method, which is designed to return outputs formatted to match a given schema. It is declared as follows:

```ts
withStructuredOutput?<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  schema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<boolean>
): Runnable<BaseLanguageModelInput, RunOutput> | Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;
```
Make sure the LLM you are using extends from `BaseLanguageModel` and implements the `withStructuredOutput` method correctly. For example, the `ChatOpenAI` class in LangChain.js supports structured output and includes the `withStructuredOutput` method [3].
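As a minimal usage sketch (the `Joke` schema and model name are hypothetical, for illustration only):

```ts
import { z } from "zod";
import { ChatOpenAI } from "@langchain/openai";

// Hypothetical schema, used only for this sketch
const Joke = z.object({
  setup: z.string().describe("The setup of the joke"),
  punchline: z.string().describe("The punchline"),
});

// Model name is illustrative
const llm = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });

// The returned Runnable's output type is inferred from the Zod schema
const structured = llm.withStructuredOutput(Joke);
const joke = await structured.invoke("Tell me a joke about TypeScript");
// joke: { setup: string; punchline: string }
```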
The `ChatGroq` class, however, extends from `BaseChatModel`, not from `BaseLanguageModel` or any other class that implements the `withStructuredOutput` method [4]. This could be the reason for the type incompatibility you are experiencing.
To resolve this issue, ensure that the LLM you are using is compatible with the `withStructuredOutput` method and meets the necessary requirements for structured output.
To continue talking to Dosu, mention @dosu.
@dosu
I am using an LLM which is compatible with `withStructuredOutput`. This problem is with TypeScript types only.
If I use just one of the LLMs, either Groq or OpenAI, it works fine. The type error appears only when the llm's type is the union of the Groq and OpenAI classes.
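A minimal sketch of what I mean (`SomeZodSchema` here stands in for any Zod schema, and the model name is illustrative):

```ts
import { z } from "zod";
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

const SomeZodSchema = z.object({ answer: z.string() });

// Fine: the static type is one concrete class
const openai = new ChatOpenAI({ model: "gpt-4o-mini" });
openai.withStructuredOutput(SomeZodSchema); // OK

// Type error: the static type is a union, and TypeScript cannot
// reconcile the two classes' differing overload sets
declare const llm: ChatGroq | ChatOpenAI;
llm.withStructuredOutput(SomeZodSchema); // "expression is not callable"-style error
```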
To resolve the TypeScript type incompatibility when using `withStructuredOutput` with either `ChatGroq` or `ChatOpenAI` in LangChain.js, you need to ensure that the `outputSchema` and `config` parameters are correctly typed and passed. The `withStructuredOutput` method has multiple overloads, and the correct one is selected based on whether `includeRaw` is typed as `false`, `true`, or `boolean`.

Here is the relevant code snippet for the `withStructuredOutput` method:
```ts
withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<false>
): Runnable<BaseLanguageModelInput, RunOutput>;

withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<true>
): Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }>;

withStructuredOutput<
  RunOutput extends Record<string, any> = Record<string, any>
>(
  outputSchema: z.ZodType<RunOutput> | Record<string, any>,
  config?: StructuredOutputMethodOptions<boolean>
):
  | Runnable<BaseLanguageModelInput, RunOutput>
  | Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: RunOutput }> {
  const schema: z.ZodType<RunOutput> | Record<string, any> = outputSchema;
  const name = config?.name;
  const description = schema.description ?? "A function available to call.";
  const method = config?.method;
  const includeRaw = config?.includeRaw;
  if (method === "jsonMode") {
    throw new Error(`ChatBedrockConverse does not support 'jsonMode'.`);
  }
  // Additional implementation details...
}
```
Ensure that the `outputSchema` and `config` parameters are correctly typed and passed to the `withStructuredOutput` method. This will help resolve the type incompatibility issue. Additionally, make sure to follow the setup and usage instructions for `ChatGroq` and `ChatOpenAI` as provided in the LangChain.js documentation [1][2].
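To illustrate how these overloads drive the inferred return type (a sketch; the `Joke` schema and model name are hypothetical):

```ts
import { z } from "zod";
import { ChatOpenAI } from "@langchain/openai";

const Joke = z.object({ setup: z.string(), punchline: z.string() });
const llm = new ChatOpenAI({ model: "gpt-4o-mini" });

// includeRaw omitted (or false): the first overload, parsed output only
const parsedOnly = llm.withStructuredOutput(Joke);
// inferred as Runnable<BaseLanguageModelInput, { setup: string; punchline: string }>

// includeRaw: true as a literal: the second overload, raw message plus parsed output
const withRaw = llm.withStructuredOutput(Joke, { includeRaw: true });
// inferred as Runnable<BaseLanguageModelInput, { raw: BaseMessage; parsed: { setup: string; punchline: string } }>
```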
If you are conditionally generating the LLM object, you might need to use a type assertion or a type guard to ensure TypeScript understands that the returned LLM object supports the `withStructuredOutput` method. Here is an example of how you might do this:
```ts
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

// Models, ProvidersType, Providers, and getLlmModelAndProvider are
// application-specific helpers from the original example.
export function getLangChainLlm(m?: Models, p?: ProvidersType): ChatGroq | ChatOpenAI {
  const { model, provider } = getLlmModelAndProvider(m, p);
  switch (provider) {
    case Providers.Enum.groq:
      return new ChatGroq({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    case Providers.Enum.openai:
      return new ChatOpenAI({
        model,
        temperature: 0,
        maxRetries: 2,
      });
    default:
      throw new Error(`Unknown provider: ${provider}`);
  }
}

const llm = getLangChainLlm();
if ("withStructuredOutput" in llm) {
  llm.withStructuredOutput(SomeZodSchema);
}
```
This approach ensures that TypeScript recognizes that the `llm` object has the `withStructuredOutput` method, resolving the type incompatibility issue.
To continue talking to Dosu, mention @dosu.
@dosu I think this is a bug and needs human intervention
This seems like it could be a bug that needs your attention. @jacoblee93, could you take a look?
To continue talking to Dosu, mention @dosu.
Hey! Sorry you're running into this - can you confirm you've got the latest versions of @langchain/groq and OpenAI?
We just pushed 0.3.0 of core yesterday, updated docs will be coming soon but you'll need the latest for all packages to use it.
Hi @jacoblee93
Thanks for your quick reply.
Yes, I had just updated to 0.3.0 before creating this issue. Here is what my package.json looks like:

```json
"@langchain/community": "^0.3.0",
"@langchain/core": "^0.3.0",
"@langchain/groq": "^0.1.1",
"@langchain/openai": "^0.3.0",
```
I checked the type defs in both packages:

https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-openai/src/chat_models.ts#L1716
https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-groq/src/chat_models.ts#L967

The only difference I see is that the OpenAI one has an extra type in the union, `StructuredOutputMethodParams<RunOutput, false>`.
Got it - will have a look but OpenAI does accept a few unique args so it may be expected
Thanks @jacoblee93
In that case, can we declare the types in a way that lets TS infer them correctly? This seems to be a problem where TS is not able to infer the type properly.
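In the meantime, a possible workaround sketch (hedged; it assumes `BaseChatModel` from `@langchain/core`, which both classes extend, declares `withStructuredOutput`) is to widen the factory's declared return type to the common base class so the call resolves against a single overload set instead of a union:

```ts
import { z } from "zod";
import { BaseChatModel } from "@langchain/core/language_models/chat_models";
import { ChatGroq } from "@langchain/groq";
import { ChatOpenAI } from "@langchain/openai";

const SomeZodSchema = z.object({ answer: z.string() });

// Same conditional construction as above, but the declared return type
// is the shared base class instead of the union ChatGroq | ChatOpenAI
function getChatModel(provider: "groq" | "openai", model: string): BaseChatModel {
  return provider === "groq"
    ? new ChatGroq({ model, temperature: 0 })
    : new ChatOpenAI({ model, temperature: 0 });
}

// The call now type-checks against BaseChatModel's single declaration
const structured = getChatModel("openai", "gpt-4o-mini").withStructuredOutput(SomeZodSchema);
```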