jxnl / instructor

Structured outputs for LLMs
https://python.useinstructor.com/
MIT License

Outdated assert using from_mistral #969

Open ahuang11 opened 2 weeks ago

ahuang11 commented 2 weeks ago

What Model are you using?

mistral-small-latest (see the snippet below)

Describe the bug

Mistral recently released a new v1 Python client (https://github.com/mistralai/client-python?tab=readme-ov-file), but `instructor.from_mistral` still asserts that the client is a legacy `MistralClient`, so the new `Mistral` client is rejected:


File ~/miniconda3/envs/lumen/lib/python3.10/site-packages/instructor/client_mistral.py:35, in from_mistral(client, mode, **kwargs)
     26 def from_mistral(
     27     client: mistralai.client.MistralClient | mistralaiasynccli.MistralAsyncClient,
     28     mode: instructor.Mode = instructor.Mode.MISTRAL_TOOLS,
     29     **kwargs: Any,
     30 ) -> instructor.Instructor | instructor.AsyncInstructor:
     31     assert mode in {
     32         instructor.Mode.MISTRAL_TOOLS,
     33     }, "Mode be one of {instructor.Mode.MISTRAL_TOOLS}"
---> 35     assert isinstance(
     36         client, (mistralai.client.MistralClient, mistralaiasynccli.MistralAsyncClient)
     37     ), "Client must be an instance of mistralai.client.MistralClient or mistralai.async_cli.MistralAsyncClient"
     39     if isinstance(client, mistralai.client.MistralClient):
     40         return instructor.Instructor(
     41             client=client,
     42             create=instructor.patch(create=client.chat, mode=mode),
   (...)
     45             **kwargs,
     46         )

AssertionError: Client must be an instance of mistralai.client.MistralClient or mistralai.async_cli.MistralAsyncClient

To Reproduce

import os
import instructor

from pydantic import BaseModel
from mistralai import Mistral

# new v1 client
client = Mistral(api_key=os.getenv("MISTRAL_API_KEY"))

# patched_chat = instructor.patch(create=client.chat.stream_async, mode=instructor.Mode.MISTRAL_TOOLS)

# enables `response_model` in the chat call; raises AssertionError here
patched_chat = instructor.from_mistral(client, mode=instructor.Mode.MISTRAL_TOOLS)

class UserDetails(BaseModel):
    name: str
    age: int

resp = await patched_chat(
    model="mistral-small-latest",
    response_model=instructor.Partial[UserDetails],
    messages=[
        {
            "role": "user",
            "content": f'Extract the following entities: "Jason is 20"',
        },
    ],
    stream=True
)
async for message in resp:
    print(message)

Expected behavior

No crash; the call should stream partial UserDetails results.


semoal commented 2 weeks ago
from typing import Iterable

import instructor
from mistralai import Mistral

s = Mistral(
    api_key="x",
)

# patch the raw async completion method instead of going through from_mistral
client = instructor.patch(
    create=s.chat.complete_async,
    mode=instructor.Mode.JSON,
)

response = await client(
    model="mistral-large-latest",
    messages=[],  # your messages here
    response_model=Iterable[Accounts],  # Accounts: your pydantic model
)
In the meantime, you can use that as a workaround.
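The workaround above references an `Accounts` model that is never defined in the comment. A plausible stand-in (the fields are hypothetical; any pydantic `BaseModel` works), together with the `Iterable` import the snippet relies on:

```python
from typing import Iterable

from pydantic import BaseModel

class Accounts(BaseModel):
    # hypothetical fields, for illustration only
    name: str
    balance: float

# instructor treats Iterable[Model] as "extract a list of Model instances"
acct = Accounts(name="Jason", balance=100.0)
print(acct)
```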