mistralai / client-python

Python client library for Mistral AI platform
Apache License 2.0

Input should be 'stop' or 'length' [type=enum, input_value='error', input_type=str] #74

Closed by pseudotensor 1 month ago

pseudotensor commented 6 months ago

It seems the Mistral API is returning a finish_reason of 'error' that the streaming client does not handle:

  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 63, in generate_from_stream
    for chunk in stream:
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/langchain_mistralai/chat_models.py", line 310, in _stream
    for chunk in self.completion_with_retry(
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/mistralai/client.py", line 208, in chat_stream
    yield ChatCompletionStreamResponse(**json_streamed_response)
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/pydantic/main.py", line 171, in __init__
    self.__pydantic_validator__.validate_python(data, self_instance=self)
pydantic_core._pydantic_core.ValidationError: 1 validation error for ChatCompletionStreamResponse
choices.0.finish_reason
  Input should be 'stop' or 'length' [type=enum, input_value='error', input_type=str]
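The traceback shows pydantic rejecting the response because the client's finish_reason enum only permits 'stop' and 'length', while the API sent 'error'. A minimal sketch of the mismatch, using hypothetical model names (`FinishReason`, `Choice`) rather than the client's actual classes:

```python
from enum import Enum
from pydantic import BaseModel, ValidationError

# Hypothetical stand-ins for the client's response models:
# the enum only allows 'stop' or 'length', mirroring the error message.
class FinishReason(str, Enum):
    stop = "stop"
    length = "length"

class Choice(BaseModel):
    finish_reason: FinishReason

# A normal response validates fine.
Choice(finish_reason="stop")

# An 'error' finish_reason fails enum validation, as in the traceback.
try:
    Choice(finish_reason="error")
except ValidationError as e:
    print(e.errors()[0]["type"])  # the pydantic error type for enum mismatches
```

Adding 'error' (and any other values the API can emit) to the enum on the client side would let the response validate instead of raising mid-stream.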
varunsingh3000 commented 6 months ago

Hi, I seem to be having the same issue. I'm calling chat completion (without streaming) with the 'mistral-medium-latest' model and get the above error, but the same request with 'mistral-small' works fine.

sophiamyang commented 1 month ago

This should be fixed now. Could you try mistralai 1.0.0 and let us know if you have this issue? Thanks!