jedarden opened this issue 7 months ago
Hey @jedarden, currently the container always tries to connect to Ollama, but the UI should fail gracefully when it can't connect. Are you seeing any errors in the UI or interface? If not, you can safely ignore these errors, as the UI catches them and just doesn't display the option to use Ollama. If you are seeing an issue with the interface, let me know what that is and we can address it; it's likely unrelated to the /tags request error.
I guess the confusion here is where you're seeing:
Unexpected Application Error! 404 Not Found
Is that just what you're seeing in the logs or is the UI not rendering for you?
The UI wasn't rendering and was just showing that text.
I ended up using a different tool to figure out how to change the TypeScript. I changed line 125 in HtmlAnnotator.tsx to hardcode my domain as iframeSrc.
So instead of:
const iframeSrc =
  import.meta.env.MODE === 'hosted' ||
  document.location.hostname.endsWith('github.dev')
    ? 'https://wandb.github.io'
    : 'http://127.0.0.1:7878'
It now reads:
const iframeSrc = 'https://openui.mydomain.com'
This fixes the iframe issue where the URL was http://127.0.0.1:7878/openui/index.html?buster=113.
However, I don't really know how to make this frontend component dynamic so that it could be deployed to any environment.
I think we should change that line to:
const iframeSrc = document.location.hostname.includes("127.0.0.1") ? 'http://127.0.0.1:7878' : 'https://wandb.github.io'
That will always use the GitHub-hosted domain when it's not running on localhost, which is generally just for development. You don't actually want it on the same domain as the parent page for security reasons: it means a user could run arbitrary JavaScript on your domain, which is bad. Using the github.io domain sandboxes the iframe so it's more secure.
I just merged the fix mentioned into main. Let me know if that didn't work.
A quick note. When I tried using the most recent commit, this log output was presented:
wandb: Unpatching OpenAI completions
INFO (openui): Starting OpenUI AI Server created by W&B...
INFO (openui): Running API Server
INFO (uvicorn.error): Started server process [1]
INFO (uvicorn.error): Waiting for application startup.
DEBUG (openui): Starting up server in 1...
INFO (uvicorn.error): Application startup complete.
INFO (uvicorn.error): Uvicorn running on http://127.0.0.1:7878 (Press CTRL+C to quit)
The last line in particular shows the server listening on 127.0.0.1, which in a Kubernetes environment means only connections from inside the pod are accepted.
I did a blunt replacement of 127.0.0.1 with 0.0.0.0 to get this part working.
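For reference, that replacement amounts to launching uvicorn bound to all interfaces, roughly like the sketch below (the module path and app name here are assumptions on my part; as noted further down in the thread, there is a supported way to get this behavior without editing the code):

import uvicorn

from openui.server import app  # assumption: the FastAPI app object is exposed here

# Bind to all interfaces so connections from outside the pod are accepted,
# instead of loopback-only.
uvicorn.run(app, host="0.0.0.0", port=7878)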
The new log output is:
wandb: Unpatching OpenAI completions
INFO (openui): Starting OpenUI AI Server created by W&B...
INFO (openui): Running API Server
INFO (uvicorn.error): Started server process [1]
INFO (uvicorn.error): Waiting for application startup.
DEBUG (openui): Starting up server in 1...
INFO (uvicorn.error): Application startup complete.
INFO (uvicorn.error): Uvicorn running on http://0.0.0.0:7878 (Press CTRL+C to quit)
When I tried generating the output, I got this:
The iframe URL was 'http://0.0.0.0:7878/openui/index.html?buster=113'.
Maybe that's from when I overrode the 127.0.0.1.
@jedarden apologies for the delay. Definitely don't replace 127.0.0.1 in the codebase. Instead you can set the OPENUI_ENVIRONMENT=production environment variable and the server will listen on 0.0.0.0 instead of 127.0.0.1; you can see that logic here.
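For anyone else reading, a minimal sketch of that host-selection logic, assuming the environment variable behaves as described above (the real code in the OpenUI backend may use different names and defaults):

import os

# Sketch only: pick the uvicorn bind address from OPENUI_ENVIRONMENT.
# "production" -> listen on all interfaces; anything else -> loopback only.
environment = os.getenv("OPENUI_ENVIRONMENT", "local")
host = "0.0.0.0" if environment == "production" else "127.0.0.1"

print(f"Server will bind to {host}:7878")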
Also, when I last pushed the fix I had forgotten to build the frontend, so definitely pull the latest main branch.
httpx.ConnectError: All connection attempts failed
The code was freshly pulled, but I still have this problem.
File "/app/openui/server.py", line 301, in ollama_models return await ollama.list() ^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/ollama/_client.py", line 629, in list response = await self._request('GET', '/api/tags') ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/ollama/_client.py", line 351, in _request response = await self._client.request(method, url, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_client.py", line 1574, in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_client.py", line 1661, in send response = await self._send_handling_auth( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_client.py", line 1689, in _send_handling_auth response = await self._send_handling_redirects( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_client.py", line 1726, in _send_handling_redirects response = await self._send_single_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_client.py", line 1763, in _send_single_request response = await transport.handle_async_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 372, in handle_async_request with map_httpcore_exceptions(): File "/usr/local/lib/python3.12/contextlib.py", line 158, in exit self.gen.throw(value) File "/venv/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions raise mapped_exc(message) from exc httpx.ConnectError: All connection attempts failed
INFO (openui): Starting OpenUI AI Server created by W&B...
INFO (openui): Running API Server
INFO (uvicorn.error): Started server process [1]
INFO (uvicorn.error): Waiting for application startup.
DEBUG (openui): Starting up server in 1...
INFO (uvicorn.error): Application startup complete.
INFO (uvicorn.error): Uvicorn running on http://0.0.0.0:7878 (Press CTRL+C to quit)
INFO (uvicorn.access): 172.17.0.1:35394 - "GET / HTTP/1.1" 200
INFO (uvicorn.access): 172.17.0.1:35394 - "GET /v1/session HTTP/1.1" 200
ERROR (openui): Server Error: All connection attempts failed
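The ConnectError above comes from the backend's ollama.list() call (ollama_models in server.py) failing because no Ollama server is reachable from the container. Purely as an illustration of the graceful-failure behavior discussed earlier, a defensive version of that call could look like the sketch below; the shipped handler may simply let the exception propagate and rely on the UI to hide the Ollama option.

import httpx
from ollama import AsyncClient

# The Ollama client should honor OLLAMA_HOST if set, otherwise it targets
# the default 127.0.0.1:11434.
ollama = AsyncClient()

async def ollama_models():
    try:
        # This is the request that produced the traceback above (GET /api/tags).
        return await ollama.list()
    except (httpx.ConnectError, httpx.TimeoutException):
        # No Ollama reachable: return an empty model list so the frontend can
        # quietly hide the Ollama option instead of surfacing a server error.
        return {"models": []}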
What's the correct way to stand up this container so it's only dependent on OpenAI?
Here's what I tried:
In /frontend, I ran:
Then I copied everything in /frontend/dist to /backend/assets/dist.
After that, I stood up a Kubernetes deployment, service, etc. for the container image using the following env variables:
Once the container is fully deployed, I go to https://openui.mydomain.com, where I'm met with this error:
Relevant logs of the backend container:
Looking through the inspector suggests that after GET /v1/session HTTP/1.1 there should be something along the lines of GET /tags. And I'm assuming that's what's causing the 404 Not Found error.
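One way to confirm which request is actually failing is to probe the endpoints from the logs directly. A small sketch, assuming the https://openui.mydomain.com deployment above and Ollama's default port of 11434:

import httpx

BASE = "https://openui.mydomain.com"  # the deployment URL used above

# Probe the backend endpoints that appear in the access log.
for path in ("/", "/v1/session"):
    resp = httpx.get(f"{BASE}{path}", follow_redirects=True)
    print(path, resp.status_code)

# The /tags lookup ultimately corresponds to Ollama's /api/tags endpoint.
# With no Ollama deployed this is expected to fail, and the UI should fall
# back to not offering Ollama models at all.
try:
    resp = httpx.get("http://127.0.0.1:11434/api/tags", timeout=2)
    print("/api/tags", resp.status_code)
except httpx.ConnectError:
    print("/api/tags unreachable (no Ollama deployed)")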