Open anmolsood21 opened 3 weeks ago
Hey @anmolsood21, thanks for raising the issue. It would be great if you could write more explicit reproduction steps. From your description, I tested the following without exceptions:
import instructor
import anthropic
from typing import Iterable
from pydantic import BaseModel

client = instructor.from_anthropic(
    anthropic.Anthropic(),
    mode=instructor.Mode.ANTHROPIC_TOOLS,
)

class User(BaseModel):
    name: str
    age: int

users = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    temperature=0.1,
    response_model=Iterable[User],
    max_tokens=1024,
    stream=False,
    messages=[
        {
            "role": "user",
            "content": "Consider this data: Jason is 10 and John is 30. "
            "Correctly segment it into entities. "
            "Make sure the JSON is correct",
        },
    ],
)

for user in users:
    print(user)
#> name='Jason' age=10
#> name='John' age=30
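Since the report mentions the response not coming back in chunks, a streaming variant of the same call would look roughly like this (a sketch; it reuses the client and User model from the snippet above, and I am assuming stream=True is what you used):

users = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    temperature=0.1,
    response_model=Iterable[User],
    max_tokens=1024,
    stream=True,  # yield each User as soon as it has been parsed
    messages=[
        {
            "role": "user",
            "content": "Consider this data: Jason is 10 and John is 30. "
            "Correctly segment it into entities.",
        },
    ],
)

for user in users:
    print(user)  # users arrive one at a time instead of after the full response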
I think you are instantiating the client incorrectly, and this may be what leads to the problem. instructor defines a dedicated constructor for the Anthropic provider (instructor.from_anthropic, as used above). Once again, if the issue persists, please paste the exact code that raises the exception. Cheers.
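For reference, a hypothetical guess at the kind of setup that causes trouble (I don't know your exact code, so treat this as an assumption), contrasted with the constructor shown above:

import anthropic
import instructor

# Hypothetical incorrect setup: wrapping the Anthropic client with the
# OpenAI-oriented patch helper instead of the Anthropic constructor.
# client = instructor.patch(anthropic.Anthropic())

# Correct setup: the Anthropic-specific constructor.
client = instructor.from_anthropic(
    anthropic.Anthropic(),
    mode=instructor.Mode.ANTHROPIC_TOOLS,
)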
What Model are you using?
claude-3-5-sonnet-20240620
Describe the bug
When using ANTHROPIC_TOOLS mode, the iterable example here (https://python.useinstructor.com/concepts/lists/#extracting-tasks-using-iterable) doesn't work. Switching to JSON mode for Anthropic, however, makes it work.

To Reproduce
Follow the example in https://python.useinstructor.com/concepts/lists/#extracting-tasks-using-iterable with an Anthropic model in tool-calling mode. The entire response is not returned in chunks when using Iterable.
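For reference, a sketch of the JSON-mode workaround that makes the example work (it mirrors the docs example; only the mode differs, and the exact prompt here is illustrative):

from typing import Iterable

import anthropic
import instructor
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


# Same setup as the docs example, but with Anthropic JSON mode instead of ANTHROPIC_TOOLS.
client = instructor.from_anthropic(
    anthropic.Anthropic(),
    mode=instructor.Mode.ANTHROPIC_JSON,
)

users = client.chat.completions.create(
    model="claude-3-5-sonnet-20240620",
    response_model=Iterable[User],
    max_tokens=1024,
    stream=True,
    messages=[
        {
            "role": "user",
            "content": "Consider this data: Jason is 10 and John is 30. Segment it into entities.",
        },
    ],
)

for user in users:
    print(user)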
Expected behavior
Iterable support should also work in tool-calling mode for Anthropic.