Closed: olafgeibig closed this issue 2 weeks ago
Description
While investigating how to make LiteLLM use OpenAI-compatible APIs with structured output, as reported in issue 1333, I discovered that it started working once I added the Pydantic schema of the model to my prompt, although I thought this was already happening under the hood in crewAI. Looking at my tracing (Arize Phoenix), I can see that crewAI is not adding the schema to the system prompt as it should.
Pydantic output works, but only if I add the schema to the user prompt myself. As you can see in my tracing, the template is in the system prompt but not the schema. I'm afraid that, like this, the agent will lose its job 😂
Steps to Reproduce
Expected behavior
Schema in the system prompt, model adheres to the schema and pydantic output has content.
Screenshots/Code snippets
Operating System
macOS Sonoma
Python Version
3.12
crewAI Version
0.61.0
crewAI Tools Version
n/a
Virtual Environment
Venv
Evidence
System prompt as logged by my tracing
Possible Solution
Probably a bug in the structured-output handling when using non-OpenAI models?
Additional context
Adding the schema to the user prompt makes the Pydantic output work:
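As a sketch of the workaround I'm using (the model and helper names here are illustrative, not part of crewAI, and this assumes Pydantic v2's `model_json_schema()` API):

```python
import json
from pydantic import BaseModel


class ResearchResult(BaseModel):
    """Hypothetical output model, for illustration only."""
    title: str
    summary: str


def with_schema(user_prompt: str, model: type[BaseModel]) -> str:
    """Append the model's JSON schema to the user prompt as a workaround."""
    schema = json.dumps(model.model_json_schema(), indent=2)  # Pydantic v2
    return (
        f"{user_prompt}\n\n"
        f"Return only JSON that conforms to this schema:\n{schema}"
    )


prompt = with_schema("Summarize the latest crewAI release.", ResearchResult)
```

With the schema embedded in the user prompt like this, the non-OpenAI model adheres to it and the Pydantic output is populated; ideally crewAI would inject the same schema into the system prompt itself.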