AumeshMisra opened 3 weeks ago
The issue you're encountering with type mismatches in createStuffDocumentsChain
is likely due to changes in the LangChain.js framework versions. Here are some steps to resolve the issue:
Ensure Compatibility: Verify that all LangChain.js packages are compatible with each other. The versions you mentioned might have introduced breaking changes. Try aligning the versions more closely to the ones that worked previously:
"@langchain/community": "^0.2.4",
"@langchain/core": "^0.2.2",
"@langchain/openai": "^0.0.34",
"langchain": "^0.2.5"
Type Assertion: If you are confident that the types are compatible at runtime, you can use type assertions to bypass TypeScript's static checks:
const model = new AzureOpenAI({
  modelName: "gpt-4o",
  temperature: 0.0,
}) as unknown as LanguageModelLike;

const prompt = PromptTemplate.fromTemplate(myPrompt) as unknown as BasePromptTemplate<any, BasePromptValueInterface, any>;
Check for Missing Properties: Ensure that AzureOpenAI and PromptTemplate include all properties and methods expected by LanguageModelLike and BasePromptTemplate, respectively. If any are missing, you might need to implement or extend them.
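As the rest of this thread suggests, the "missing properties" symptom usually comes from two copies of @langchain/core being installed side by side, not from genuinely absent methods. A self-contained sketch (class names here are hypothetical) of why two structurally identical classes can still be unassignable in TypeScript: private members are compared by declaration, so the "same" class from two package copies produces exactly this kind of ts(2322) error.

```typescript
// Hypothetical stand-ins for the same class shipped by two copies of a package.
// Both look identical, but each declares its own private member.
class RunnableA {
  private brand = "core";
  invoke(input: string): string {
    return `A:${input}`;
  }
}

class RunnableB {
  private brand = "core";
  invoke(input: string): string {
    return `B:${input}`;
  }
}

// The next line would fail to compile with ts(2322), because TypeScript
// treats private members nominally: "Types have separate declarations of
// a private property 'brand'."
// const r: RunnableA = new RunnableB();

const a = new RunnableA();
console.log(a.invoke("x")); // prints "A:x"
```

This is why deduplicating @langchain/core (via resolutions/overrides) fixes the error without any code changes.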
Review Documentation: Check the latest documentation for any changes in the expected types or new requirements introduced in the newer versions of the packages.
If these steps do not resolve the issue, consider downgrading to the versions that worked previously until a more permanent fix is available.
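If you do downgrade, pinning exact versions (dropping the ^ so the resolver cannot float to a newer patch) makes the rollback deterministic. A sketch using the versions listed above:

```json
{
  "dependencies": {
    "@langchain/community": "0.2.4",
    "@langchain/core": "0.2.2",
    "@langchain/openai": "0.0.34",
    "langchain": "0.2.5"
  }
}
```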
For more details on similar issues, you can refer to the discussions and issues in the LangChain.js repository.
Hey @AumeshMisra, sorry you're running into this - can you try setting your resolutions field like this to ensure that only one version of @langchain/core is present?
https://js.langchain.com/v0.2/docs/how_to/installation/#installing-integration-packages
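With Yarn, the resolutions field might look like this in package.json (the version shown is illustrative; pin whichever @langchain/core release your other packages expect):

```json
{
  "resolutions": {
    "@langchain/core": "0.2.9"
  }
}
```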
CC @bracesproul
Hi, I run into an issue, that might be related. I am following this tutorial here:
const examplePrompt = ChatPromptTemplate.fromMessages([
  ['human', '{input}'],
  ['ai', '{output}'],
]);
const fewShotPrompt = new FewShotChatMessagePromptTemplate({
  examplePrompt,
  examples: examplesFewShot,
  inputVariables: [],
});
const finalPrompt = ChatPromptTemplate.fromMessages([
  ['system', systemTemplateTaskClassificationFewShot],
  fewShotPrompt,
  ['human', '{input}'],
]);
My package.json looks like this:
"dependencies": {
  "langchain": "^0.2.7",
  "@langchain/azure-openai": "^0.0.11",
},
...,
"resolutions": {
  "@langchain/core": "0.2.09"
}
I get the following TS error for using FewShotChatMessagePromptTemplate in ChatPromptTemplate.fromMessages:
Type 'FewShotChatMessagePromptTemplate<any, any>' is not assignable to type 'ChatPromptTemplate<InputValues, string> | BaseMessagePromptTemplateLike'.
Type 'FewShotChatMessagePromptTemplate<any, any>' is missing the following properties from type 'ChatPromptTemplate<InputValues, string>': promptMessages, _parseImagePrompts ts(2322)
I would be very grateful for any input. I don't see where the usage is different from the tutorial.
Thank you very much!
@ik4Rus can you try without the 0 in front of the 9 for resolutions?
So:
"resolutions": {
"@langchain/core": "0.2.9"
}
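For anyone using npm rather than Yarn, the equivalent mechanism is the overrides field (supported since npm 8); a sketch with the same pin:

```json
{
  "overrides": {
    "@langchain/core": "0.2.9"
  }
}
```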
Checked other resources
Example Code
const prompt = PromptTemplate.fromTemplate(myPrompt);
const model = new AzureOpenAI({
  modelName: "gpt-4o",
  temperature: 0.0,
});
Error Message and Stack Trace (if applicable)
No response
Description
I am trying to use createStuffDocumentsChain but it's unable to resolve types.
For the llm param, I get:
For the prompt param, I get:
The expected type comes from property 'prompt' which is declared here on type '{ llm: LanguageModelLike; prompt: BasePromptTemplate<any, BasePromptValueInterface, any>; outputParser?: BaseOutputParser<...> | undefined; documentPrompt?: BasePromptTemplate<...> | undefined; documentSeparator?: string | undefined; }'
This was fine when I had: "@langchain/community": "^0.2.4", "@langchain/core": "^0.2.2", "@langchain/openai": "^0.0.34",
But now there's an error when I have: "@langchain/community": "^0.2.4", "@langchain/core": "^0.2.6", "@langchain/openai": "^0.1.2",
I even made sure to add: "langchain": "^0.2.5", but I'm still getting issues there.
System Info
"@langchain/core": "^0.2.6",
"@langchain/openai": "^0.1.2",
"@langchain/pinecone": "0.0.6",
"@pinecone-database/pinecone": "^2.2.2",
"@prisma/client": "^5.14.0",
"axios": "^1.6.8",
"body-parser": "^1.20.2",
"cors": "^2.8.5",
"dotenv": "^16.4.5",
"express": "^4.19.2",
"langchain": "^0.2.5",