Closed idontcare999a closed 1 month ago
I cannot reproduce this.
For me, the chat message is correctly parsed.
Can you check your version of pydantic? It should be: pydantic = "^2.5.2"
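One way to check which pydantic is actually installed without leaving Python is to query package metadata from the standard library (a sketch; the helper name is mine, and it avoids importing pydantic itself):

```python
from importlib import metadata

def pydantic_version():
    # Returns the installed pydantic version string, or None if not installed.
    try:
        return metadata.version("pydantic")
    except metadata.PackageNotFoundError:
        return None

print(pydantic_version())
```

On a matching install this should report 2.5.2 or later; anything on the 1.x line lacks the `model_dump()` method the client expects.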
The currently installed version is pydantic==2.5.2.
I get the same error.
Running on Google Colab with pip install mistralai~=0.0.8:
from google.colab import userdata  # needed for userdata.get() below; missing from the snippet as posted
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
model = "mistral-tiny"
client = MistralClient(api_key=userdata.get('MISTRAL_KEY'))
messages = [
ChatMessage(role="user", content="What is the best French cheese?")
]
# No streaming
chat_response = client.chat(
model=model,
messages=messages,
)
Also tried the langchain integration and got the same error:
pip install langchain~=0.1.0
pip install langchain_mistralai
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-22-656503716d17> in <cell line: 14>()
12
13 # No streaming
---> 14 chat_response = client.chat(
15 model=model,
16 messages=messages,
2 frames
/usr/local/lib/python3.10/dist-packages/mistralai/client.py in chat(self, model, messages, temperature, max_tokens, top_p, random_seed, safe_mode)
125 ) -> ChatCompletionResponse:
126 """A chat endpoint that returns a single response.
--> 127
128 Args:
129 model (str): model the name of the model to chat with, e.g. mistral-tiny
/usr/local/lib/python3.10/dist-packages/mistralai/client_base.py in _make_chat_request(model, messages, temperature, max_tokens, top_p, random_seed, stream, safe_mode)
35 self._api_key = api_key
36 self._logger = logging.getLogger(__name__)
---> 37
38 def _make_chat_request(
39 self,
/usr/local/lib/python3.10/dist-packages/mistralai/client_base.py in <listcomp>(.0)
35 self._api_key = api_key
36 self._logger = logging.getLogger(__name__)
---> 37
38 def _make_chat_request(
39 self,
AttributeError: 'ChatMessage' object has no attribute 'model_dump'
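The traceback points at pydantic's v1→v2 rename: `BaseModel.dict()` became `model_dump()` in pydantic 2, so a client that calls `model_dump()` on a message built against pydantic 1 fails exactly like this. A minimal version-agnostic workaround sketch (the `dump_message` helper name is hypothetical, not part of the library; the stand-in class only simulates a v1-style model):

```python
def dump_message(msg):
    # pydantic v2 renamed BaseModel.dict() to model_dump(); calling
    # model_dump() on a v1-era model raises the AttributeError above.
    if hasattr(msg, "model_dump"):   # pydantic v2 model
        return msg.model_dump()
    return msg.dict()                # pydantic v1 fallback

class V1Style:
    # Stand-in for a pydantic v1 model: exposes .dict() but not model_dump().
    def dict(self):
        return {"role": "user", "content": "hi"}

print(dump_message(V1Style()))  # {'role': 'user', 'content': 'hi'}
```

The real fix, as noted below, is simply an up-to-date client build rather than a shim like this.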
Ah, I see this was fixed on main; I don't get the error anymore with:
pip install git+https://github.com/mistralai/client-python
Description: Encountered an AttributeError when calling MistralClient.chat() with ChatMessage objects. The error indicates that ChatMessage objects do not have a model_dump() method, which the _make_chat_request method in client_base.py expects.
Environment: Google Colab, Python 3.10, mistralai~=0.0.8, pydantic==2.5.2.
Code to Reproduce: see the snippet above.
Steps to Reproduce: install mistralai~=0.0.8 on Colab and call client.chat() with the messages list above.
Expected Behavior: The chat() method should correctly process the ChatMessage objects and return a chat completion response.
Actual Behavior: An AttributeError is raised: 'ChatMessage' object has no attribute 'model_dump'. This occurs at the line in client_base.py where model_dump() is called on each ChatMessage object in the messages list.