lrq3000 opened 2 months ago
Same error here. To me this looks like an issue on the ollama side. Refer to https://github.com/ollama/ollama/issues/5909
Update: As the ollama issue suggests, I tried to change the litellm endpoint:
- ChatModel.LOCAL_LLAMA_3: "ollama_chat/llama3",
+ ChatModel.LOCAL_LLAMA_3: "ollama/llama3.1",
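The change above can be sketched as a plain Python mapping; the `ChatModel` enum and `MODEL_MAP` names here are assumptions about the project's structure, inferred from the diff. The relevant litellm behavior is that the `ollama_chat/` prefix routes requests through ollama's `/api/chat` endpoint, while the plain `ollama/` prefix uses `/api/generate`:

```python
from enum import Enum

# Hypothetical enum mirroring the ChatModel referenced in the diff.
class ChatModel(Enum):
    LOCAL_LLAMA_3 = "local-llama-3"

MODEL_MAP = {
    # Before: "ollama_chat/llama3" (ollama's /api/chat endpoint).
    # After:  "ollama/llama3.1"    (ollama's /api/generate endpoint).
    ChatModel.LOCAL_LLAMA_3: "ollama/llama3.1",
}

print(MODEL_MAP[ChatModel.LOCAL_LLAMA_3])  # ollama/llama3.1
```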
This now results in the following error:
[...]
File "/workspace/.venv/lib/python3.12/site-packages/instructor/retry.py", line 173, in retry_sync
raise InstructorRetryException(
instructor.retry.InstructorRetryException: 1 validation error for RelatedQueries
related_questions
Input should be a valid array [type=list_type, input_value='[', input_type=str]
For further information visit https://errors.pydantic.dev/2.9/v/list_type
It looks like input_value='[' is passed as a truncated string instead of '[]' when the list is empty, so pydantic rejects it as not being a list.
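The failure can be reproduced with a minimal pydantic v2 model; the `RelatedQueries` name comes from the traceback above, but the exact field definition is an assumption:

```python
from pydantic import BaseModel, ValidationError

# Hypothetical model mirroring the RelatedQueries schema in the traceback.
class RelatedQueries(BaseModel):
    related_questions: list[str]

# A truncated JSON fragment arrives as the plain string '[', not a list,
# so validation fails with the same list_type error as in the traceback.
try:
    RelatedQueries(related_questions='[')
except ValidationError as e:
    print(e.errors()[0]["type"])  # list_type

# An actual empty list validates fine.
print(RelatedQueries(related_questions=[]).related_questions)  # []
```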
A simple workaround for me is to just not use llama3. Gemma2 works fine; I did not try phi3.
(Sidenote: I upgraded to python3.12 and pydantic 2.9.0)
Found a workaround for the issue... give it a try.
I get the following error when I try to do a search (especially in Expert mode):
500: {"error":"json: cannot unmarshal string into Go struct field ChatRequest.messages of type api.ToolCallFunctionArguments"}
I am using the Docker approach and ollama and phi3:14b on Windows 11.