rashadphz / farfalle

🔍 AI search engine - self-host with local or cloud LLMs
https://www.farfalle.dev/
Apache License 2.0
2.7k stars 243 forks

500 error: json: cannot unmarshal string into Go struct field ChatRequest.messages of type api.ToolCallFunctionArguments #89

Open lrq3000 opened 2 months ago

lrq3000 commented 2 months ago

I get the following error when I try to do a search (especially when in Expert mode):

500: {"error":"json: cannot unmarshal string into Go struct field ChatRequest.messages of type api.ToolCallFunctionArguments"}

I am using the Docker approach and ollama and phi3:14b on Windows 11.
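For illustration of what the Go error means: Ollama's `/api/chat` expects tool-call `arguments` to be a JSON object, while some OpenAI-style clients serialize `arguments` as a JSON-encoded string, which Go then cannot unmarshal into the structured `ToolCallFunctionArguments` field. A minimal sketch of the two payload shapes (field contents are illustrative, not farfalle's actual request):

```python
import json

# Object-style arguments: what Ollama's Go struct can unmarshal.
object_style = {"function": {"name": "search", "arguments": {"query": "foo"}}}

# String-style arguments: the OpenAI convention, a JSON string, which
# triggers "cannot unmarshal string into Go struct field ... ToolCallFunctionArguments".
string_style = {"function": {"name": "search", "arguments": json.dumps({"query": "foo"})}}

print(type(object_style["function"]["arguments"]).__name__)  # → dict
print(type(string_style["function"]["arguments"]).__name__)  # → str
```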

jandoerntlein commented 1 month ago

Same error here. This may be an issue on the ollama side; see https://github.com/ollama/ollama/issues/5909

Update: As the ollama issue suggests, I tried changing the litellm model string:

-    ChatModel.LOCAL_LLAMA_3: "ollama_chat/llama3",
+    ChatModel.LOCAL_LLAMA_3: "ollama/llama3.1",
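For context: in litellm, the `ollama_chat/` prefix routes requests to Ollama's native `/api/chat` endpoint, while `ollama/` routes to `/api/generate`, so switching prefixes sidesteps the `/api/chat` tool-call unmarshalling bug. A minimal sketch of the mapping change (the variable name is an assumption, not farfalle's actual config):

```python
# Hypothetical alias-to-litellm model-string map; farfalle's real config
# keys this on a ChatModel enum instead of plain strings.
MODEL_MAP = {
    "llama3": "ollama/llama3.1",  # was "ollama_chat/llama3"
}

def resolve(model_alias: str) -> str:
    """Return the litellm model string for an alias, passing unknowns through."""
    return MODEL_MAP.get(model_alias, model_alias)

print(resolve("llama3"))  # → "ollama/llama3.1"
```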

This now results in the following error:

[...]
File "/workspace/.venv/lib/python3.12/site-packages/instructor/retry.py", line 173, in retry_sync
    raise InstructorRetryException(
instructor.retry.InstructorRetryException: 1 validation error for RelatedQueries
related_questions
  Input should be a valid array [type=list_type, input_value='[', input_type=str]
    For further information visit https://errors.pydantic.dev/2.9/v/list_type

It looks like input_value='[' arrives truncated; an empty list should be passed as '[]'.
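As a quick standard-library illustration (not farfalle's code) of why the lone bracket fails: `'['` is invalid JSON, while `'[]'` parses to an empty list, which is what pydantic's `list_type` validator would accept:

```python
import json

# A complete empty array parses fine:
print(json.loads("[]"))  # → []

# A stream truncated after the opening bracket does not:
try:
    json.loads("[")
except json.JSONDecodeError as e:
    print("invalid:", e)
```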

A simple workaround for this issue (for me) is just not using llama3: Gemma2 works fine. I did not try phi3.

(Sidenote: I upgraded to python3.12 and pydantic 2.9.0)

jandoerntlein commented 1 month ago

Found a workaround that fixes the issue; give it a try.

0001-fix-issues-89-unmarshallling-string-failed.patch