BrainBlend-AI / atomic-agents


Claude not working #28

Closed ivan-saorin closed 7 hours ago

ivan-saorin commented 7 hours ago

I adapted the chat with followup questions example to use Anthropic Claude.

The modifications are straightforward:

```python
# client = instructor.from_openai(openai.OpenAI())
client = instructor.from_anthropic(anthropic.Anthropic())

custom_system_message = SystemPromptGenerator(
    background=['You are a helpful assistant that can only answer about Star Wars facts.'],
    steps=[
        "Analyze the user's question and determine if it is about Star Wars.",
        "If it is, provide a fact about Star Wars.",
        "If it's not, apologize and say you can only answer about Star Wars."
    ],
    output_instructions=["Your output should always be presented in the style of Sherlock Holmes"]
)

agent = BaseAgent(
    config=BaseAgentConfig(
        client=client,
        # model='gpt-4o-mini',
        model='claude-3-5-haiku-20241022',
        system_prompt_generator=custom_system_message,
        output_schema=StarWarsFactsOutputSchema
    )
)
```

However, I receive an error that is not present in the OpenAI version.

```
...\instructor\retry.py", line 164, in retry_sync
    raise InstructorRetryException(
instructor.exceptions.InstructorRetryException: Missing required arguments; Expected either ('max_tokens', 'messages' and 'model') or ('max_tokens', 'messages', 'model' and 'stream') arguments to be given
```

KennyVaneetvelde commented 7 hours ago

Hmm, I don't immediately see anything wrong, but judging from the error message, you are either missing a parameter or it is an actual bug.

Either way, the error is coming from https://github.com/instructor-ai/instructor and not from Atomic Agents, so I'd ask around there!

Good luck!

ivan-saorin commented 1 hour ago

The problem is that `client.chat.completions.create` needs the `max_tokens` parameter to work with Claude's models. However, atomic-agents does not provide a way to pass that parameter through its `BaseAgentConfig` class, so as it stands Claude will never work with atomic-agents.
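For what it's worth, calling the instructor-patched Anthropic client directly does succeed once `max_tokens` is supplied. Minimal sketch (the response model here is just illustrative, not my actual schema):

```python
import anthropic
import instructor
from pydantic import BaseModel

class Fact(BaseModel):
    # Illustrative response model, not the StarWarsFactsOutputSchema from above
    fact: str

client = instructor.from_anthropic(anthropic.Anthropic())

result = client.chat.completions.create(
    model="claude-3-5-haiku-20241022",
    max_tokens=1024,  # Anthropic's Messages API rejects the request without this
    messages=[{"role": "user", "content": "Tell me a Star Wars fact."}],
    response_model=Fact,
)
print(result.fact)
```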

I will try to fix it on my fork.
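Roughly the shape of the change I have in mind (purely illustrative names, not the actual atomic-agents internals): let the config carry an optional `max_tokens` and forward it to the underlying create call.

```python
from typing import Any, Optional
from pydantic import BaseModel

# Hypothetical sketch only; class, field, and function names are illustrative.
class PatchedAgentConfig(BaseModel):
    client: Any
    model: str
    max_tokens: Optional[int] = None  # required by Anthropic, optional elsewhere

def create_kwargs(config: PatchedAgentConfig) -> dict:
    """Extra keyword arguments the agent would forward to client.chat.completions.create."""
    extra: dict = {}
    if config.max_tokens is not None:
        extra["max_tokens"] = config.max_tokens
    return extra
```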

Thank you very much :)