mistralai / client-python

Python client library for Mistral AI platform
Apache License 2.0

AttributeError in MistralClient when using ChatMessage with `model_dump()` #18

Closed · idontcare999a closed this issue 1 month ago

idontcare999a commented 9 months ago

Environment:

Python version: 3.9
MistralAI package version: 0.0.8

Description: Encountered an AttributeError when attempting to use the MistralClient.chat() method with ChatMessage objects. The error suggests that ChatMessage objects do not have a model_dump() method, which is expected by the _make_chat_request method in client_base.py.

Code to Reproduce:

import os
from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

api_key = os.environ["MISTRAL_API_KEY"]  # read the key from the environment
model = "mistral-tiny"

client = MistralClient(api_key=api_key)

chat_response = client.chat(
    model=model,
    messages=[ChatMessage(role="user", content="What is the best French cheese?")],
)
print(chat_response.choices[0].message.content)

Expected Behavior: The chat() method should correctly process the ChatMessage object and return a chat completion response.

Actual Behavior: An AttributeError is raised:

AttributeError: 'ChatMessage' object has no attribute 'model_dump'

This occurs at the line in client_base.py where it tries to call model_dump() on each ChatMessage object in the messages list.
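For context, model_dump() is the Pydantic v2 serialization method; v1 models expose dict() instead, which is why a Pydantic v1 install in the environment produces exactly this AttributeError. A version-agnostic serializer could sidestep it — this is a hedged sketch, not the library's actual code, and to_dict and FakeV1Message are illustrative names:

```python
from typing import Any


def to_dict(message: Any) -> dict:
    """Serialize a message object under either Pydantic major version.

    Tries the v2 model_dump() first, falls back to the v1 dict()
    method, then to a plain attribute dict for non-Pydantic objects.
    """
    if hasattr(message, "model_dump"):   # Pydantic v2 API
        return message.model_dump()
    if hasattr(message, "dict"):         # Pydantic v1 API
        return message.dict()
    return dict(vars(message))           # plain-object fallback


class FakeV1Message:
    """Stand-in mimicking a Pydantic v1 model (dict(), no model_dump())."""

    def __init__(self, role: str, content: str) -> None:
        self.role, self.content = role, content

    def dict(self) -> dict:
        return {"role": self.role, "content": self.content}


print(to_dict(FakeV1Message("user", "hi")))
# -> {'role': 'user', 'content': 'hi'}
```

The same helper would also pass a real v2 ChatMessage straight through via model_dump().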

Steps to Reproduce:

Initialize a MistralClient with a valid API key.
Attempt to send a chat request using the chat() method with a ChatMessage object.
Observe the AttributeError.

Bam4d commented 9 months ago

I cannot reproduce this.

For me, the chat message is correctly parsed.

Can you check your version of pydantic? It should be: pydantic = "^2.5.2"

idontcare999a commented 9 months ago

Current installed version is pydantic==2.5.2
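One thing worth ruling out: pip can report a different environment than the interpreter actually running the code (a common trap on Colab and with multiple virtualenvs). A small diagnostic, checked from inside the running process itself — a sketch, not part of the mistralai client:

```python
import sys


def runtime_pydantic_version() -> str:
    """Return the pydantic version seen by THIS interpreter, if any."""
    try:
        import pydantic
    except ImportError:
        return "not importable"
    # Both major versions expose VERSION; v2 also has __version__.
    return getattr(pydantic, "VERSION", getattr(pydantic, "__version__", "unknown"))


print(sys.executable)              # which Python binary is running
print(runtime_pydantic_version())  # which pydantic that binary imports
```

If this prints a 1.x version while pip show says 2.5.2, the client is importing a stale Pydantic v1 and the model_dump() call will fail.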

MoritzLaurer commented 8 months ago

I get the same error.

Running on Google Colab, installed with pip install mistralai~=0.0.8:

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage
from google.colab import userdata  # Colab secret storage for the API key

model = "mistral-tiny"

client = MistralClient(api_key=userdata.get('MISTRAL_KEY'))

messages = [
    ChatMessage(role="user", content="What is the best French cheese?")
]

# No streaming
chat_response = client.chat(
    model=model,
    messages=messages,
)

Also tried with the LangChain integration (pip install langchain~=0.1.0 and pip install langchain_mistralai) and got the same error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
[<ipython-input-22-656503716d17>](https://localhost:8080/#) in <cell line: 14>()
     12 
     13 # No streaming
---> 14 chat_response = client.chat(
     15     model=model,
     16     messages=messages,

2 frames
[/usr/local/lib/python3.10/dist-packages/mistralai/client.py](https://localhost:8080/#) in chat(self, model, messages, temperature, max_tokens, top_p, random_seed, safe_mode)
    125     ) -> ChatCompletionResponse:
    126         """A chat endpoint that returns a single response.
--> 127 
    128         Args:
    129             model (str): model the name of the model to chat with, e.g. mistral-tiny

[/usr/local/lib/python3.10/dist-packages/mistralai/client_base.py](https://localhost:8080/#) in _make_chat_request(model, messages, temperature, max_tokens, top_p, random_seed, stream, safe_mode)
     35         self._api_key = api_key
     36         self._logger = logging.getLogger(__name__)
---> 37 
     38     def _make_chat_request(
     39         self,

[/usr/local/lib/python3.10/dist-packages/mistralai/client_base.py](https://localhost:8080/#) in <listcomp>(.0)
     35         self._api_key = api_key
     36         self._logger = logging.getLogger(__name__)
---> 37 
     38     def _make_chat_request(
     39         self,

AttributeError: 'ChatMessage' object has no attribute 'model_dump'

MoritzLaurer commented 8 months ago

Ah, I see this was fixed on main. I no longer get the error after pip install git+https://github.com/mistralai/client-python
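After reinstalling from main, it can help to confirm which package version the interpreter actually sees before making a real API call. A small stdlib-only check (installed_version is an illustrative helper name, not a mistralai API):

```python
import importlib.metadata


def installed_version(package: str) -> str:
    """Return the installed version of a package, or 'not installed'."""
    try:
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return "not installed"


# A git+https install from main should report something newer than the
# broken 0.0.8 release discussed above.
print(installed_version("mistralai"))
```

If this still prints 0.0.8, the reinstall landed in a different environment than the one running the code.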