jxnl / instructor

structured outputs for llms
https://python.useinstructor.com/
MIT License

MistralAI broken with 1.4.0 #968

Closed ahuang11 closed 2 weeks ago

ahuang11 commented 2 weeks ago

What Model are you using?

mistral-large-latest

Describe the bug

Instructor works with MistralAI in 1.3.7, but not 1.4.0.

Failed after retries: <bound method Future.exception of <Future at 0x113f5eec0 state=finished raised AttributeError>>
Traceback (most recent call last):
  File "/Users/ahuang/miniconda3/envs/lumen/lib/python3.10/site-packages/instructor/retry.py", line 242, in retry_async
    return await process_response_async(
  File "/Users/ahuang/miniconda3/envs/lumen/lib/python3.10/site-packages/instructor/process_response.py", line 77, in process_response_async
    model = response_model.from_response(
  File "/Users/ahuang/miniconda3/envs/lumen/lib/python3.10/site-packages/instructor/function_calls.py", line 149, in from_response
    return cls.parse_tools(completion, validation_context, strict)
  File "/Users/ahuang/miniconda3/envs/lumen/lib/python3.10/site-packages/instructor/function_calls.py", line 318, in parse_tools
    message.refusal is None
  File "/Users/ahuang/miniconda3/envs/lumen/lib/python3.10/site-packages/pydantic/main.py", line 828, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}')
AttributeError: 'AssistantMessage' object has no attribute 'refusal'
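The failure mode can be reproduced in isolation: a Pydantic model raises AttributeError when an undefined field is accessed, which is presumably what happens when instructor 1.4.0 checks the OpenAI-specific `refusal` attribute on Mistral's `AssistantMessage`. A minimal sketch (the `AssistantMessage` class below is a stand-in for illustration, not the real mistralai class):

```python
from pydantic import BaseModel

# Stand-in for mistralai's AssistantMessage, which defines no `refusal` field
class AssistantMessage(BaseModel):
    role: str = "assistant"
    content: str = ""

message = AssistantMessage(content="hello")

# Direct attribute access on an undefined field raises AttributeError,
# matching the traceback above; a defensive getattr lookup avoids the crash.
refusal = getattr(message, "refusal", None)
print(refusal)  # None
```

A guard like `getattr(message, "refusal", None)` in instructor's `parse_tools` would tolerate providers whose message objects lack the attribute.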

To Reproduce

import asyncio
import os

import instructor
from mistralai import Mistral
from pydantic import BaseModel

client = Mistral(api_key=os.getenv("MISTRAL_API_KEY"))

# Patching enables `response_model` in the chat call
patched_chat = instructor.patch(
    create=client.chat.complete_async, mode=instructor.Mode.MISTRAL_TOOLS
)

class UserDetails(BaseModel):
    name: str
    age: int

async def main():
    resp = await patched_chat(
        model="mistral-large-latest",
        response_model=UserDetails,
        messages=[
            {
                "role": "user",
                "content": 'Extract the following entities: "Jason is 20"',
            },
        ],
    )
    print(resp)

asyncio.run(main())

Expected behavior

The call should succeed and return a parsed UserDetails instance instead of crashing.
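Until a fix lands, pinning to the last working release noted above is a straightforward workaround (shown here as a requirements.txt fragment):

```
instructor==1.3.7
```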


ahuang11 commented 2 weeks ago

Duplicate of https://github.com/jxnl/instructor/issues/953