mistralai / client-python

Python client library for Mistral AI platform
Apache License 2.0

Mistral-medium constantly fails with internal server error with this simple script #96

Closed: pseudotensor closed this issue 5 months ago

pseudotensor commented 5 months ago
````python
import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

params = {'max_tokens': 1024, 'model': 'mistral-medium', 'random_seed': 13400, 'safe_prompt': False, 'temperature': 0.3, 'top_p': 1.0}
messages = [{'role': 'user', 'content': 'Give an example employee profile.  Make up values if required, do not ask further questions.'}, {'role': 'assistant', 'content': "{'name': 'Henry', 'age': 35, 'skills': ['Python', 'TensorFlow', 'Machine Learning'], 'workhistory': [{'company': 'Google', 'duration': '2015-2020', 'position': 'AI Research Scientist'}, {'company': 'Microsoft', 'duration': '2012-2015', 'position': 'Software Engineer'}]}", 'tool_calls': []}, {'role': 'user', 'content': '\n"""\nHenry is a good AI scientist.\n"""\n\nEnsure your entire response is outputted as strict valid JSON text inside a Markdown code block with the json language identifier.\n\n\nEnsure you follow this JSON schema, and ensure to use the same key names as the schema:\n```json\n{"name": {"type": "string"}, "age": {"type": "integer"}, "skills": {"type": "array", "items": {"type": "string", "maxLength": 10}, "minItems": 3}, "workhistory": {"type": "array", "items": {"type": "object", "properties": {"company": {"type": "string"}, "duration": {"type": "string"}, "position": {"type": "string"}}, "required": ["company", "position"]}}}\n```\n\nGive me another one, ensure it has a totally different name and totally different age.\n'}]

api_key = os.environ["MISTRAL_API_KEY"]
model = "mistral-medium"

client = MistralClient(api_key=api_key)

# Convert the raw dicts into the ChatMessage objects the client expects.
messages = [
    ChatMessage(**x) for x in messages
]

# No streaming
chat_response = client.chat(
    messages=messages,
    **params
)

print(chat_response.choices[0].message.content)
````
This fails with:

```
Traceback (most recent call last):
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/mistralai/client.py", line 130, in _request
    yield self._check_response(response)
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/mistralai/client.py", line 71, in _check_response
    self._check_response_status_codes(response)
  File "/home/jon/miniconda3/envs/h2ogpt/lib/python3.10/site-packages/mistralai/client.py", line 49, in _check_response_status_codes
    raise MistralAPIStatusException.from_response(
mistralai.exceptions.MistralAPIStatusException: Status: 500. Message: {"object":"error","message":"Service unavailable.","type":"internal_server_error","param":null,"code":"1000"}
```
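
(Editor's note, not part of the original report.) For anyone hitting the same intermittent 500, a minimal sketch of wrapping the call in a retry loop. The helper name `chat_with_retries`, the attempt count, and the backoff values are illustrative, and it assumes the error is transient, as it turned out to be here; it reuses the `client`, `messages`, and `params` defined above.

```python
import time

from mistralai.exceptions import MistralAPIStatusException


def chat_with_retries(client, messages, params, attempts=3, base_delay=1.0):
    """Call client.chat(), retrying with exponential backoff on API status errors."""
    for attempt in range(attempts):
        try:
            return client.chat(messages=messages, **params)
        except MistralAPIStatusException:
            # The 500 "Service unavailable." above appears transient, so wait and retry;
            # re-raise once the retry budget is exhausted.
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)


chat_response = chat_with_retries(client, messages, params)
print(chat_response.choices[0].message.content)
```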
fuegoio commented 5 months ago

Hi :) Thanks for reporting the issue!

It seems that I can't reproduce the issue with mistral-medium using your snippet. Is that still the case?

pseudotensor commented 5 months ago

I don't see it anymore; it must have been a system issue. Closing.