Closed. canberkkoc1 closed this issue 9 months ago.
🤖
Hello @canberkkoc1! 👋
I'm Dosu, a friendly bot designed to assist you in navigating through our codebase, debugging issues, and becoming a contributor. While we're waiting for a human maintainer to join us, I'm here to help you out. Looking forward to our coding adventure together!
Based on the error message you're encountering, it seems like there's an issue with the `chatMessage._getType` function. This is a method defined on the `ChatMessage` class in the langchainjs codebase and is used to get the type of a message; for the `ChatMessage` class, it returns the string `"generic"`.

Here is the definition of `_getType` in the `ChatMessage` class:
```typescript
/**
 * Represents a chat message in a conversation.
 */
export class ChatMessage
  extends BaseMessage
  implements ChatMessageFieldsWithRole
{
  static lc_name() {
    return "ChatMessage";
  }

  role: string;

  constructor(content: string, role: string);

  constructor(fields: ChatMessageFieldsWithRole);

  constructor(fields: string | ChatMessageFieldsWithRole, role?: string) {
    if (typeof fields === "string") {
      // eslint-disable-next-line no-param-reassign, @typescript-eslint/no-non-null-assertion
      fields = { content: fields, role: role! };
    }
    super(fields);
    this.role = fields.role;
  }

  _getType(): MessageType {
    return "generic";
  }

  static isInstance(message: BaseMessage): message is ChatMessage {
    return message._getType() === "generic";
  }
}
```
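For context, the usual cause of a `chatMessage._getType is not a function` error is that some part of the chain receives a plain object (or a string) where a `BaseMessage` instance is expected. Here is a minimal self-contained sketch of why that matters; the `ChatMessage` class below is a simplified stand-in for illustration, not langchain's actual implementation:

```typescript
// Simplified stand-in for langchain's ChatMessage; illustration only.
class ChatMessage {
  constructor(public content: string, public role: string) {}
  _getType(): string {
    return "generic";
  }
}

const real = new ChatMessage("hello", "user");
console.log(real._getType()); // "generic"

// Round-tripping through JSON (or constructing a plain object literal)
// strips the prototype, so the method disappears and calling it throws.
const plain = JSON.parse(JSON.stringify(real));
console.log(typeof plain._getType); // "undefined"
```

This is why the fix below centers on making sure the memory hands real message objects, not serialized or stringified history, back to the chain.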
You can find this in the `langchain-core/src/messages/index.ts` file.
I found a similar issue in the langchainjs repository: message._getType is not a function. The suggested solution was to include `returnMessages: true` when initializing the `AgentExecutorWithOptions`. Here is the corrected code:
```typescript
memory: history
  ? new BufferMemory({
      returnMessages: true,
      memoryKey: "chat_history",
      inputKey: "input",
      outputKey: "output",
      chatHistory: history,
    })
  : undefined,
```
Please try this solution and let me know if it resolves your issue. If not, could you provide more details about your implementation? Specifically, how are you initializing and using the `ConversationalRetrievalQAChain` class? This will help me better understand the context and provide a more accurate solution.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
If you want to continue the conversation, start your reply with @dosu-bot.
```typescript
// Imports assume the pre-0.1 langchain package layout in use at the time.
import { RecursiveCharacterTextSplitter } from "langchain/text_splitter";
import { ChatOpenAI } from "langchain/chat_models/openai";
import { OpenAIEmbeddings } from "langchain/embeddings/openai";
import { HNSWLib } from "langchain/vectorstores/hnswlib";
import { ConversationalRetrievalQAChain } from "langchain/chains";
import { BufferMemory } from "langchain/memory";

const textSplitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000 });
const model = new ChatOpenAI({
  modelName: "gpt-3.5-turbo",
  temperature: 0.8,
});

const docs = await textSplitter.createDocuments([transcriptString]);
const vectorStore = await HNSWLib.fromDocuments(docs, new OpenAIEmbeddings());

chain = ConversationalRetrievalQAChain.fromLLM(
  model,
  vectorStore.asRetriever(),
  {
    memory: new BufferMemory({
      memoryKey: "chat_history", // Must be set to "chat_history"
    }),
  }
);

const response = await chain.call({
  question: initialPrompt,
});
return response;
```
This code fixed the issue.
I am getting this error in the YouTube chatbot: `[chain/error] [1:chain:ConversationalRetrievalQAChain] [1ms] Chain run error: "chatMessage._getType is not a function"`