Running chat_completion on Ollama sometimes works, but mostly returns a "can't be blank" error

Open jamilabreu opened 1 week ago

This might be because llama3 returns the empty string ("") as its response.

A usual way to handle this is to retry: either via Instructor.chat_completion(..., max_retries: 3), or by pattern matching on {:error, ...} and making the recursive API call up to n times.
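The recursive pattern-matching retry described above could be sketched roughly like this. Note this is a minimal, hypothetical helper, not part of the Instructor API: `RetryExample`, `with_retries/2`, and the wrapped function are illustrative names, and the zero-arity function stands in for the actual `Instructor.chat_completion` call.

```elixir
defmodule RetryExample do
  # Call `fun` up to `attempts` times, retrying whenever it returns
  # {:error, _} (e.g. a "can't be blank" changeset from an empty
  # llama3 response). Returns the first {:ok, _}, or the last error.
  def with_retries(fun, attempts) when attempts > 0 do
    case fun.() do
      {:ok, _} = ok ->
        ok

      {:error, _} = err ->
        if attempts > 1, do: with_retries(fun, attempts - 1), else: err
    end
  end
end
```

Usage would look like `RetryExample.with_retries(fn -> Instructor.chat_completion(...) end, 3)`, so the recursion lives in the helper and the call site stays a one-liner.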