rashadphz / farfalle

🔍 AI search engine - self-host with local or cloud LLMs
https://www.farfalle.dev/
Apache License 2.0

bug json decoder delimiter error #42

Open nopoz opened 1 month ago

nopoz commented 1 month ago

Build: 0f45bd461c85b08abeca04eb147a804ce69348cc44aa1357908791fa6bb7551a
Ollama: 0.1.41

Using Ollama with the Gemma model, I get the following error in the log:

2024-06-02 16:17:13 Traceback (most recent call last):
2024-06-02 16:17:13   File "/workspace/src/backend/chat.py", line 111, in stream_qa_objects
2024-06-02 16:17:13     async for completion in response_gen:
2024-06-02 16:17:13   File "/workspace/.venv/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py", line 280, in wrapped_gen
2024-06-02 16:17:13     async for x in f_return_val:
2024-06-02 16:17:13   File "/workspace/.venv/lib/python3.11/site-packages/llama_index/llms/ollama/base.py", line 408, in gen
2024-06-02 16:17:13     chunk = json.loads(line)
2024-06-02 16:17:13             ^^^^^^^^^^^^^^^^
2024-06-02 16:17:13   File "/usr/local/lib/python3.11/json/__init__.py", line 346, in loads
2024-06-02 16:17:13     return _default_decoder.decode(s)
2024-06-02 16:17:13            ^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 16:17:13   File "/usr/local/lib/python3.11/json/decoder.py", line 337, in decode
2024-06-02 16:17:13     obj, end = self.raw_decode(s, idx=_w(s, 0).end())
2024-06-02 16:17:13                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 16:17:13   File "/usr/local/lib/python3.11/json/decoder.py", line 353, in raw_decode
2024-06-02 16:17:13     obj, end = self.scan_once(s, idx)
2024-06-02 16:17:13                ^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 16:17:13 json.decoder.JSONDecodeError: Expecting ',' delimiter: line 1 column 1443 (char 1442)
2024-06-02 16:17:13 
2024-06-02 16:17:13 During handling of the above exception, another exception occurred:
2024-06-02 16:17:13 
2024-06-02 16:17:13 Traceback (most recent call last):
2024-06-02 16:17:13   File "/workspace/src/backend/main.py", line 97, in generator
2024-06-02 16:17:13     async for obj in stream_qa_objects(chat_request):
2024-06-02 16:17:13   File "/workspace/src/backend/chat.py", line 140, in stream_qa_objects
2024-06-02 16:17:13     raise HTTPException(status_code=500, detail=detail)
2024-06-02 16:17:13 fastapi.exceptions.HTTPException: 500: Expecting ',' delimiter: line 1 column 1443 (char 1442)

Partway through producing an answer, the UI clears all details from the search and displays the error:

500: Expecting ',' delimiter: line 1 column 1443 (char 1442) 
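For context on the failure mode: the traceback shows `llama_index`'s Ollama client calling `json.loads(line)` on each line of the NDJSON stream. If a network chunk boundary splits a JSON object, parsing the partial line raises exactly this kind of `JSONDecodeError`. A minimal sketch of a defensive alternative (this is a hypothetical illustration, not farfalle's or llama-index's actual code) buffers raw chunks and only decodes complete newline-terminated lines:

```python
import json

def parse_ndjson_stream(chunks):
    """Yield JSON objects from an NDJSON stream, tolerating chunk
    boundaries that fall in the middle of a JSON object.

    Hypothetical sketch: accumulate raw text in a buffer and only call
    json.loads() on complete lines, instead of decoding every chunk
    immediately.
    """
    buffer = ""
    for chunk in chunks:
        buffer += chunk
        # Decode every complete line currently in the buffer.
        while "\n" in buffer:
            line, buffer = buffer.split("\n", 1)
            if line.strip():
                yield json.loads(line)
    # Flush any trailing object that arrived without a final newline.
    if buffer.strip():
        yield json.loads(buffer)

# A single JSON object split across two simulated network chunks:
chunks = ['{"response": "Hel', 'lo"}\n{"done": true}\n']
objs = list(parse_ndjson_stream(chunks))
# objs == [{"response": "Hello"}, {"done": true}]
```

Naively calling `json.loads` on the first chunk above would raise `JSONDecodeError`, which matches the `Expecting ',' delimiter` error in the log.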
sebaxzero commented 1 month ago

I believe the issue may be related to Ollama and function calling, as I have encountered this error multiple times even when using Mistral with Ollama.

I have used the same models (Llama3, Mistral, Phi, but not Gemma) with LM Studio without this problem. However, Llama3 and Phi sometimes refuse to generate related queries on sensitive topics due to censorship in the model (uncensored variants have no issue); when that happens, I get a validation error for related queries from the instructor function calling.
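The validation error described above is consistent with how instructor works: it parses the model's output into a Pydantic schema, so a plain-prose refusal cannot satisfy the expected fields. A minimal sketch of that failure (the schema and field names here are hypothetical, not farfalle's actual ones):

```python
from pydantic import BaseModel, ValidationError

class RelatedQueries(BaseModel):
    """Hypothetical schema for the related-queries structured output."""
    related_questions: list[str]

# A refusal from a censored model is plain prose, not the JSON the
# schema expects, so validation fails instead of returning queries.
refusal = "I cannot help generate queries on that topic."
try:
    RelatedQueries.model_validate_json(refusal)
    handled = False
except ValidationError:
    handled = True
```

This is why swapping in an uncensored model makes the error disappear: the output then actually matches the structured schema.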