rashadphz / farfalle

🔍 AI search engine - self-host with local or cloud LLMs
https://www.farfalle.dev/
Apache License 2.0

Error: 500 - Request URL is missing an 'http://' or 'https://' protocol #38

Closed: hirowa closed this issue 5 months ago

hirowa commented 6 months ago

Description:

When attempting to make a request, I encountered a 500 error.

Error Message:

"500: Request URL is missing an 'http://' or 'https://' protocol."

Environment:

Additional Context:

Please let me know if further details are required.

Verfinix commented 6 months ago

I have the same error. Below is the log from the backend:

2024-06-02 20:22:35 INFO: 172.18.0.1:63658 - "GET / HTTP/1.1" 404 Not Found
2024-06-02 20:22:35 INFO: 172.18.0.1:63658 - "GET /favicon.ico HTTP/1.1" 404 Not Found
2024-06-02 20:22:41 INFO: 172.18.0.1:57224 - "OPTIONS /chat HTTP/1.1" 200 OK
2024-06-02 20:22:41 INFO: 172.18.0.1:57238 - "POST /chat HTTP/1.1" 200 OK
2024-06-02 20:22:43 Traceback (most recent call last):
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 69, in map_httpcore_exceptions
2024-06-02 20:22:43     yield
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 373, in handle_async_request
2024-06-02 20:22:43     resp = await self._pool.handle_async_request(req)
2024-06-02 20:22:43            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpcore/_async/connection_pool.py", line 167, in handle_async_request
2024-06-02 20:22:43     raise UnsupportedProtocol(
2024-06-02 20:22:43 httpcore.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.
2024-06-02 20:22:43
2024-06-02 20:22:43 The above exception was the direct cause of the following exception:
2024-06-02 20:22:43
2024-06-02 20:22:43 Traceback (most recent call last):
2024-06-02 20:22:43   File "/workspace/src/backend/chat.py", line 111, in stream_qa_objects
2024-06-02 20:22:43     async for completion in response_gen:
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/llama_index/core/llms/callbacks.py", line 280, in wrapped_gen
2024-06-02 20:22:43     async for x in f_return_val:
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/llama_index/llms/ollama/base.py", line 401, in gen
2024-06-02 20:22:43     async with client.stream(
2024-06-02 20:22:43   File "/usr/local/lib/python3.11/contextlib.py", line 210, in __aenter__
2024-06-02 20:22:43     return await anext(self.gen)
2024-06-02 20:22:43            ^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1617, in stream
2024-06-02 20:22:43     response = await self.send(
2024-06-02 20:22:43                ^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1661, in send
2024-06-02 20:22:43     response = await self._send_handling_auth(
2024-06-02 20:22:43                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1689, in _send_handling_auth
2024-06-02 20:22:43     response = await self._send_handling_redirects(
2024-06-02 20:22:43                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1726, in _send_handling_redirects
2024-06-02 20:22:43     response = await self._send_single_request(request)
2024-06-02 20:22:43                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_client.py", line 1763, in _send_single_request
2024-06-02 20:22:43     response = await transport.handle_async_request(request)
2024-06-02 20:22:43                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 372, in handle_async_request
2024-06-02 20:22:43     with map_httpcore_exceptions():
2024-06-02 20:22:43   File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
2024-06-02 20:22:43     self.gen.throw(typ, value, traceback)
2024-06-02 20:22:43   File "/workspace/.venv/lib/python3.11/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
2024-06-02 20:22:43     raise mapped_exc(message) from exc
2024-06-02 20:22:43 httpx.UnsupportedProtocol: Request URL is missing an 'http://' or 'https://' protocol.
2024-06-02 20:22:43
2024-06-02 20:22:43 During handling of the above exception, another exception occurred:
2024-06-02 20:22:43
2024-06-02 20:22:43 Traceback (most recent call last):
2024-06-02 20:22:43   File "/workspace/src/backend/main.py", line 97, in generator
2024-06-02 20:22:43     async for obj in stream_qa_objects(chat_request):
2024-06-02 20:22:43   File "/workspace/src/backend/chat.py", line 140, in stream_qa_objects
2024-06-02 20:22:43     raise HTTPException(status_code=500, detail=detail)
2024-06-02 20:22:43 fastapi.exceptions.HTTPException: 500: Request URL is missing an 'http://' or 'https://' protocol.
2024-06-02 20:22:43

Even with a Groq key, I am having the same issue.

rashadphz commented 6 months ago

Did you modify your OLLAMA_HOST environment variable at all?

hirowa commented 6 months ago

I had it pointing to 0.0.0.0. I deleted that, so it uses the default host now. The error now showing is "500: All connection attempts failed".

[screenshot of the error attached]
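For reference, the first error comes straight from httpx, which the traceback above bottoms out in: a bare address with no scheme cannot be turned into a request URL. A minimal sketch, not farfalle code, assuming the Ollama client passes the configured host through as its base URL:

```python
import httpx

# Minimal sketch: a request URL without a scheme is rejected by httpx at send
# time, which is the underlying cause of the 500 when OLLAMA_HOST is set to a
# bare address such as 0.0.0.0.
try:
    httpx.get("0.0.0.0/api/chat")  # no "http://" or "https://" prefix
except httpx.UnsupportedProtocol as exc:
    print(exc)  # Request URL is missing an 'http://' or 'https://' protocol.

# With the scheme included, httpx can build and send the request:
# httpx.get("http://localhost:11434/api/chat")
```

The follow-on "500: All connection attempts failed" is httpx's ConnectError message: the URL now parses, but nothing answers at the address it resolves to, which typically happens when the backend runs in Docker and "localhost" refers to the container rather than the machine running Ollama.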

Verfinix commented 5 months ago

Can you advise what to change if I want to point to a remote Ollama host?

Can I just change the line below?

  - OLLAMA_HOST=${OLLAMA_HOST:-http://172.16.66.201:11434}

The full docker-compose.dev.yaml is below:

services:
  backend:
    build:
      context: .
      dockerfile: ./src/backend/Dockerfile
    restart: always
    ports:

networks:
  searxng:
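For what it's worth, a quick way to check whether a remote OLLAMA_HOST value will work before wiring it into docker-compose is a small reachability probe (a sketch, not part of farfalle), using Ollama's standard /api/tags endpoint:

```python
import os

import httpx

# Reachability probe (sketch): verify that the OLLAMA_HOST value you plan to
# put in docker-compose/.env actually works. The value must include the
# scheme, e.g. http://172.16.66.201:11434.
host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

try:
    resp = httpx.get(f"{host}/api/tags", timeout=5.0)  # Ollama's model-list endpoint
    resp.raise_for_status()
    models = [m["name"] for m in resp.json().get("models", [])]
    print(f"{host} is reachable, models: {models}")
except httpx.UnsupportedProtocol:
    print("OLLAMA_HOST is missing the 'http://' or 'https://' prefix")
except httpx.ConnectError:
    print("URL format is fine, but the host is unreachable ('All connection attempts failed')")
```

Note that when this runs inside the backend container, localhost and 0.0.0.0 refer to the container itself; a LAN IP such as the one above, or host.docker.internal on Docker Desktop, is usually what you want.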

rashadphz commented 5 months ago

Hey, I updated the docker-compose file and added a .env-template. The custom setup should be clearer and more flexible now. The new instructions are in the README. Let me know if you have any problems setting this up!

You should be able to modify OLLAMA_HOST in your .env.
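For anyone tracing where that value ends up: it ultimately has to reach the llama_index Ollama client as a full URL, scheme included. A hypothetical sketch of that wiring, for illustration only and not farfalle's actual code:

```python
import os

from llama_index.llms.ollama import Ollama  # the client shown in the traceback above

# Hypothetical wiring: docker-compose injects OLLAMA_HOST from .env into the
# container, and the backend hands it to the client as base_url.
ollama_host = os.environ.get("OLLAMA_HOST", "http://localhost:11434")

llm = Ollama(
    model="llama3",          # placeholder model name
    base_url=ollama_host,    # must include the http:// or https:// scheme
    request_timeout=120.0,
)
```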