coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new

Ollama Models Not Visible When Accessing Bolt Through Remote URL #343

Open joobert opened 2 days ago

joobert commented 2 days ago

Describe the bug

When accessing Bolt through a remote URL, Ollama models are not visible in the web UI, despite both services being individually accessible remotely. The models appear correctly when accessing Bolt through localhost.

Current Configuration

Remote Access Status

Observed Behavior

  1. Working Scenario:

    • Accessing Bolt through localhost shows Ollama models
    • Bolt's localhost instance successfully connects to remote Ollama URL
    • Both services are individually accessible through their remote URLs
  2. Not Working:

    • When accessing Bolt through remote URL (<remote-ip>:5173), Ollama models are not visible
    • This occurs despite both services being independently accessible remotely

Link to the Bolt URL that caused the error

http://<remote-ip>:5173

Steps to reproduce

  1. Set up Ollama (steps 1–3 are consolidated in a shell sketch after this list):

    • Install Ollama on the remote machine
    • Set OLLAMA_HOST=0.0.0.0
    • Verify Ollama is running on port 11434
  2. Set up Bolt:

    • Install Bolt using Docker
    • Configure OLLAMA_API_BASE_URL=<remote-ip>:11434
    • Verify Bolt is running on port 5173
  3. Verify Individual Remote Access:

    • Access Ollama API at http://<remote-ip>:11434 - Should respond
    • Access Bolt UI at http://<remote-ip>:5173 - Should load
  4. Test Local Access:

    • Open Bolt through localhost (http://localhost:5173)
    • Verify Ollama models are visible in the UI
  5. Test Remote Access:

    • Open Bolt through remote URL (http://<remote-ip>:5173)
    • Navigate to model selection area
    • Observe that Ollama models are not visible
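As a consolidated sketch of steps 1–3, the setup and verification could look like the shell session below. The Docker image name and flags are assumptions (the exact container setup isn't shown in the report), and `<remote-ip>` is kept as a placeholder:

```bash
# Step 1: on the remote machine, expose Ollama on all interfaces (port 11434)
export OLLAMA_HOST=0.0.0.0
ollama serve &

# Step 2: run Bolt in Docker, pointing it at the remote Ollama instance
# (image name is illustrative; adjust to however your container was built)
docker run -d -p 5173:5173 \
  -e OLLAMA_API_BASE_URL=http://<remote-ip>:11434 \
  bolt-ai:development

# Step 3: verify each service is individually reachable from another machine
curl http://<remote-ip>:11434     # Ollama's root endpoint answers "Ollama is running"
curl -I http://<remote-ip>:5173   # Bolt UI should return HTTP 200
```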

Expected behavior

Since both services are accessible remotely and the configuration works via localhost, the Ollama models should be visible when accessing Bolt through the remote URL.

Screen Recording / Screenshot

No response

Platform

Additional context

Are there additional configuration requirements for remote-to-remote communication between Bolt and Ollama?
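As an extra diagnostic (a suggestion, not from the original report): Bolt's Ollama provider typically builds its model list from Ollama's standard model-listing endpoint, `/api/tags`, so confirming that this endpoint responds from the machine the browser runs on may narrow things down:

```bash
# Run this from the machine whose browser shows the empty model list.
# /api/tags is Ollama's model-listing endpoint; the UI dropdown depends on it.
curl http://<remote-ip>:11434/api/tags
# Expected: JSON such as {"models":[{"name":"llama3:latest", ...}, ...]}
```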

adrianpuiu commented 1 day ago

Same here. None of the other models load anymore, and all of the provider lists are empty. After some time they did get populated, but it took quite a long time for me.

creuzerm commented 1 day ago

I am chasing this too.

I know that to enable remote access, we want to start it with --host:

```bash
pnpm run dev --host
```

This works fine for me. But then I try to wrap it in a domain name so I can get SSL (via a Cloudflare Tunnel), so I can maybe step around the following error:

```
Failed to spawn bolt shell
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.
```
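For context, `self.crossOriginIsolated` requires more than HTTPS alone: the document also has to be served with COOP and COEP response headers. A minimal sketch of setting those on the Vite dev server (assuming the dev UI is served through Vite; this is not the project's actual config):

```ts
// vite.config.ts (sketch)
import { defineConfig } from 'vite';

export default defineConfig({
  server: {
    headers: {
      // Both headers must be present for self.crossOriginIsolated to be true,
      // which SharedArrayBuffer transfer to a Worker depends on.
      'Cross-Origin-Opener-Policy': 'same-origin',
      'Cross-Origin-Embedder-Policy': 'require-corp',
    },
  },
});
```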

I see that we may have a couple of hard-coded references that won't let us access the model data.

[two screenshots of the hard-coded references]

So I suspect we need to chase those two items down and make them more relative.

As an example of in-network access by IP address (which the --host flag enabled), we still get a misalignment in the URL for a model. We can see that the Ollama model does get read in and populated. [screenshot]
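A sketch of the kind of "make it relative" change suggested above; the helper name is hypothetical, since the actual hard-coded spots only appear in the screenshots:

```ts
// Hypothetical helper: resolve the Ollama base URL at runtime instead of
// hard-coding http://localhost:11434, so remote and tunneled access work.
function resolveOllamaBaseUrl(): string {
  // Prefer an explicit environment override when one is configured.
  const fromEnv =
    typeof process !== 'undefined' ? process.env.OLLAMA_API_BASE_URL : undefined;
  if (fromEnv) return fromEnv;

  // In the browser, fall back to the host the page was actually loaded from,
  // rather than assuming localhost.
  if (typeof window !== 'undefined') {
    return `${window.location.protocol}//${window.location.hostname}:11434`;
  }

  // Last resort for local development.
  return 'http://localhost:11434';
}
```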

zhengzhongquan commented 23 hours ago

Please use https://domain.com. WebSockets can't run over plain http, but they do work over localhost.