Closed markbrule closed 2 days ago
Just as a follow up, I tried the ChatOpenAI client on the same data and it worked fine.
The error is returned by https://api.mistral.ai/v1/chat/completions.
Please check that the `messages`
parameter meets the requirements (according to the Mistral AI API documentation, the message role sequence has specific constraints).
You can find the documentation at https://docs.mistral.ai/api/
I hope this helps.
According to the Mistral API, the ToolMessage
is expected to be the final message in the tool-calling turn with the LLM. If we pass other messages such as SystemMessage
or HumanMessage
afterwards, it throws an error.
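As a rough illustration of that ordering rule, here is a hypothetical helper (not part of LangChain or the Mistral SDK) that scans a role sequence for a 'user' or 'system' message directly following a 'tool' message — the pattern that triggers the 400 error reported in this issue:

```python
# Hypothetical validator for the role-ordering rule described above:
# the Mistral chat API rejects a 'user' (or 'system') message that
# comes immediately after a 'tool' message.

def find_invalid_transition(roles):
    """Return the index of the first message whose role is not allowed
    to directly follow a 'tool' message, or None if the order is fine."""
    for i in range(1, len(roles)):
        if roles[i - 1] == "tool" and roles[i] in ("user", "system"):
            return i
    return None

# The sequence from the error in this issue: 'user' right after 'tool'.
print(find_invalid_transition(["system", "user", "assistant", "tool", "user"]))
# A sequence that ends the tool turn with an assistant message passes.
print(find_invalid_transition(["system", "user", "assistant", "tool", "assistant"]))
```

This is only a sketch of the constraint, not an exhaustive encoding of Mistral's message rules.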
I'm trying to create an agent with LangGraph and I ran into the same error.
I'm still looking for another way, but so far I don't see anything other than clearing the previous messages. ☹️
You need to make a separate Mistral API call after the ToolMessage
is processed.
That's what Gemini suggested, e.g.:
```
Mistral API 1st call:
[SystemMessage(...), HumanMessage(...), ToolMessage(...)]

Mistral API 2nd call:
[HumanMessage("tool message content"), AIMessage(...)]
```
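The two-call workaround above can be sketched as a splitting step that runs before the requests are sent. This is a hypothetical helper using plain role/content dicts rather than LangChain message classes; each resulting batch would then be sent as its own chat-completions request:

```python
# Hypothetical sketch of the two-call workaround: split the history so
# that no 'user' or 'system' message directly follows a 'tool' message.

def split_after_tool(messages):
    """Split a message list into batches such that each batch never has
    a 'user' or 'system' message immediately after a 'tool' message."""
    batches, current = [], []
    for msg in messages:
        if current and current[-1]["role"] == "tool" \
                and msg["role"] in ("user", "system"):
            batches.append(current)
            current = []
        current.append(msg)
    if current:
        batches.append(current)
    return batches

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2 + 2?"},
    {"role": "assistant", "content": "", "tool_calls": ["add(2, 2)"]},
    {"role": "tool", "content": "4"},
    {"role": "user", "content": "Now multiply that by 3."},
]
batches = split_after_tool(history)
# First batch ends with the tool result; the second starts with the
# follow-up user message, so neither batch trips the ordering check.
```

Whether carrying the earlier context into the second call this way preserves enough state for the model is a separate question; this only shows the mechanical split.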
URL
https://python.langchain.com/v0.2/docs/how_to/extraction_examples/
Issue with current documentation:
I was having a problem using ChatMistralAI for extraction with examples, so I followed the how-to page exactly. Without examples it works fine, but when I add the examples as described here:
https://python.langchain.com/v0.2/docs/how_to/extraction_examples/#with-examples-
I get the following error:
HTTPStatusError: Error response 400 while fetching https://api.mistral.ai/v1/chat/completions: {"object":"error","message":"Unexpected role 'user' after role 'tool'","type":"invalid_request_error","param":null,"code":null}
Idea or request for content:
No response