Open andytriboletti opened 3 hours ago
As in this issue, your client is created incorrectly.
If you read the docs closely, you will notice that the OpenAI client is being leveraged for Ollama, yielding an instructor.Instructor instance in this case. Furthermore, it may not be crystal clear, but the mode is set to instructor.Mode.JSON.
For LLMs without tool-calling support, not overriding the default TOOLS mode triggers this chain: Instructor attempts to parse tool-call responses --> finds none --> throws the error you saw.
Hope this clarifies the issue.
What Model are you using?
Llama3.2
Describe the bug When trying to handle multiple files using a List in my model, I'm getting an error about multiple tool calls.
Code:
Error message: Error: Instructor does not support multiple tool calls, use List[Model] instead
What's the correct way to structure a model to handle multiple files when the error suggests using List[Model]? I'm trying to get multiple files (with filename and content) in a single response, but getting the "multiple tool calls" error.
The error message suggests using List[Model], but I'm already using List[CodeFile]. Would appreciate guidance on the correct way to structure this.
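One way to avoid the multiple-tool-calls path entirely is to wrap the list in a single container model, so the whole response is one object. A minimal sketch using Pydantic — the CodeFiles wrapper name and its fields are assumptions for illustration, not from the original report:

```python
from typing import List
from pydantic import BaseModel

class CodeFile(BaseModel):
    filename: str
    content: str

# Wrapping the list in one container model means the response is a
# single object, so Instructor never has to merge multiple tool calls.
class CodeFiles(BaseModel):
    files: List[CodeFile]
```

Passed as response_model=CodeFiles in the create() call, and combined with Mode.JSON as above, this sidesteps tool-call parsing altogether.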
To Reproduce
Run this script.
Expected behavior It outputs some files.