coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

Failed to spawn shell / creating files #57

Open leex279 opened 1 month ago

leex279 commented 1 month ago

Describe the bug

I installed Bolt on my Ubuntu 23 server (a container within Proxmox). It runs so far, but it can't spawn a shell and therefore can't create files.

Is there anything I'm missing? I didn't see a hint in the docs/README that anything needs to be configured. Is it only meant to run on a local machine, and not tested on a server?

Failed to spawn shell

Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated

Link to the Bolt URL that caused the error

http://192.168.1.106:8788/chat/todo-app-nextjs-tailwind

Steps to reproduce


Expected behavior

The shell should spawn and work.

Screen Recording / Screenshot

(screenshot attached)

Platform

Additional context

No response

leex279 commented 1 month ago

Workaround/fix: I installed Caddy as a web server with a self-signed certificate so I can reach the app via HTTPS. Then it works.
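
For context on why HTTPS helps: the error above means self.crossOriginIsolated is false. SharedArrayBuffer transfer needs a cross-origin-isolated page, which requires a secure context, and plain http:// on a LAN IP does not qualify (http://localhost does). A minimal sketch of the Caddy setup, assuming Caddy v2 and Bolt listening on port 8788 as in the reported URL:

```sh
# Minimal sketch: serve Bolt over HTTPS on the LAN IP using Caddy's internal
# (self-signed) CA; reverse_proxy passes the app's response headers through.
sudo tee /etc/caddy/Caddyfile >/dev/null <<'EOF'
192.168.1.106 {
    tls internal
    reverse_proxy localhost:8788
}
EOF
sudo systemctl reload caddy
```

After accepting the self-signed certificate warning in the browser once, the page should count as a secure context and the shell should spawn.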

80Builder80 commented 1 month ago

> Workaround/fix: I installed Caddy as a web server with a self-signed certificate so I can reach the app via HTTPS. Then it works.

Would you be willing to share your installation process? I have been beating my head against a wall trying to get this installed on my Ubuntu server.

leex279 commented 1 month ago

I just installed it as described in the README.

  1. git clone https://github.com/coleam00/bolt.new-any-llm.git
  2. pnpm install (install pnpm first if it's not present: "curl -fsSL https://get.pnpm.io/install.sh | sh -")
  3. asked ChatGPT to write a script to install and configure Caddy: https://chatgpt.com/share/67193069-6cdc-800d-89f7-af80e420717c
  4. configure the .env.local file (in my case, with an OpenAI API key)
  5. pnpm run build
  6. pnpm run start

If I remember correctly, that was it :D
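
Condensed into a single script (a sketch; it assumes Ubuntu with git and curl available, and that OPENAI_API_KEY is the variable name expected in .env.local, per the project's .env.example):

```sh
# Sketch of the steps above as one script. You may need to re-open the shell
# after the pnpm installer so pnpm is on PATH.
curl -fsSL https://get.pnpm.io/install.sh | sh -   # install pnpm if missing
git clone https://github.com/coleam00/bolt.new-any-llm.git
cd bolt.new-any-llm
pnpm install
echo 'OPENAI_API_KEY=sk-...' > .env.local          # your provider key here
pnpm run build
pnpm run start
```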

nowjon commented 1 month ago

Mine only runs on localhost; how can I expose the application on 0.0.0.0?

leex279 commented 1 month ago

You can pass --host 0.0.0.0 as an argument, but then I guess you'll face the same problem I initially reported here. That's why I used a reverse proxy.
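
For reference, a sketch of passing the flag through pnpm (assuming the dev script runs Vite, which accepts --host; pnpm forwards trailing arguments to the script):

```sh
# Sketch: bind the dev server to all interfaces instead of localhost only.
pnpm run dev --host 0.0.0.0
```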

nowjon commented 1 month ago

> You can pass --host 0.0.0.0 as an argument, but then I guess you'll face the same problem I initially reported here. That's why I used a reverse proxy.

Can you try running 'pnpm run dev' with Chrome Canary and see if you get that console error? I don't when I run it this way, but code still does not generate into the code window on the right, only within the chat, even after stopping, opening the code window, and then re-sending my prompt.

leex279 commented 1 month ago

I think that's exactly the same problem I had, with the error at the top. Did you check your dev tools (F12) and/or the terminal in Bolt? Isn't the same error there?

leex279 commented 1 month ago

Here you see: the normal chat is working, but not the coding part. (screenshots attached)

nowjon commented 1 month ago

My normal chat works with Ollama, but no, I do not get any errors in the terminal. I do get errors in the console, listed below:

chunk-TLFX46WQ.js?v=aa9e8abc:7399 Warning: Text content did not match. Server: "Ollama" Client: "Anthropic"
    at option
    at select
    at div
    at ModelSelector (http://localhost:5173/app/components/chat/BaseChat.tsx:33:3)
    at div
    at div
    at div
    at div
    at div
    at _c2 (http://localhost:5173/app/components/chat/BaseChat.tsx:96:3)
    at ClientOnly (http://localhost:5173/node_modules/.vite/deps/remix-utils_client-only.js?v=aa9e8abc:38:23)
    at div
    at Index
    at RenderedRoute (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:411:5)
    at Outlet (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:744:26)
    at App
    at Layout (http://localhost:5173/app/root.tsx:98:3)
    at RenderedRoute (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:411:5)
    at RenderErrorBoundary (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:371:5)
    at DataRoutes (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:1395:5)
    at Router (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:751:15)
    at RouterProvider (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:1212:5)
    at RemixErrorBoundary (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:2756:5)
    at RemixBrowser (http://localhost:5173/node_modules/.vite/deps/chunk-CKAID7AU.js?v=aa9e8abc:4314:46)

chunk-CKAID7AU.js?v=aa9e8abc:2794 Error: Hydration failed because the initial UI does not match what was rendered on the server.
    at throwOnHydrationMismatch (chunk-TLFX46WQ.js?v=aa9e8abc:16351:17)
    at tryToClaimNextHydratableInstance (chunk-TLFX46WQ.js?v=aa9e8abc:16372:15)
    at updateHostComponent (chunk-TLFX46WQ.js?v=aa9e8abc:21672:13)
    at beginWork (chunk-TLFX46WQ.js?v=aa9e8abc:22813:22)
    at beginWork$1 (chunk-TLFX46WQ.js?v=aa9e8abc:26631:22)
    at performUnitOfWork (chunk-TLFX46WQ.js?v=aa9e8abc:26076:20)
    at workLoopSync (chunk-TLFX46WQ.js?v=aa9e8abc:26015:13)
    at renderRootSync (chunk-TLFX46WQ.js?v=aa9e8abc:25994:15)
    at performSyncWorkOnRoot (chunk-TLFX46WQ.js?v=aa9e8abc:25752:28)
    at flushSyncCallbacks (chunk-TLFX46WQ.js?v=aa9e8abc:15997:30)
chunk-TLFX46WQ.js?v=aa9e8abc:7399 Warning: An error occurred during hydration. The server HTML was replaced with client content in <div>.
chunk-TLFX46WQ.js?v=aa9e8abc:14290 Uncaught Error: Text content does not match server-rendered HTML.
    at checkForUnmatchedText (chunk-TLFX46WQ.js?v=aa9e8abc:14290:19)
    at diffHydratedProperties (chunk-TLFX46WQ.js?v=aa9e8abc:14738:21)
    at hydrateInstance (chunk-TLFX46WQ.js?v=aa9e8abc:15489:18)
    at prepareToHydrateHostInstance (chunk-TLFX46WQ.js?v=aa9e8abc:16388:31)
    at completeWork (chunk-TLFX46WQ.js?v=aa9e8abc:23164:23)
    at completeUnitOfWork (chunk-TLFX46WQ.js?v=aa9e8abc:26102:24)
    at performUnitOfWork (chunk-TLFX46WQ.js?v=aa9e8abc:26084:13)
    at workLoopSync (chunk-TLFX46WQ.js?v=aa9e8abc:26015:13)
    at renderRootSync (chunk-TLFX46WQ.js?v=aa9e8abc:25994:15)
    at performSyncWorkOnRoot (chunk-TLFX46WQ.js?v=aa9e8abc:25752:28)
chunk-TLFX46WQ.js?v=aa9e8abc:16351 Uncaught Error: Hydration failed because the initial UI does not match what was rendered on the server.
    at throwOnHydrationMismatch (chunk-TLFX46WQ.js?v=aa9e8abc:16351:17)
    at tryToClaimNextHydratableInstance (chunk-TLFX46WQ.js?v=aa9e8abc:16372:15)
    at updateHostComponent (chunk-TLFX46WQ.js?v=aa9e8abc:21672:13)
    at beginWork (chunk-TLFX46WQ.js?v=aa9e8abc:22813:22)
    at beginWork$1 (chunk-TLFX46WQ.js?v=aa9e8abc:26631:22)
    at performUnitOfWork (chunk-TLFX46WQ.js?v=aa9e8abc:26076:20)
    at workLoopSync (chunk-TLFX46WQ.js?v=aa9e8abc:26015:13)
    at renderRootSync (chunk-TLFX46WQ.js?v=aa9e8abc:25994:15)
    at performSyncWorkOnRoot (chunk-TLFX46WQ.js?v=aa9e8abc:25752:28)
    at flushSyncCallbacks (chunk-TLFX46WQ.js?v=aa9e8abc:15997:30)
chunk-TLFX46WQ.js?v=aa9e8abc:21638 Uncaught Error: There was an error while hydrating. Because the error happened outside of a Suspense boundary, the entire root will switch to client rendering.
    at updateHostRoot (chunk-TLFX46WQ.js?v=aa9e8abc:21638:65)
    at beginWork (chunk-TLFX46WQ.js?v=aa9e8abc:22811:22)
    at beginWork$1 (chunk-TLFX46WQ.js?v=aa9e8abc:26631:22)
    at performUnitOfWork (chunk-TLFX46WQ.js?v=aa9e8abc:26076:20)
    at workLoopSync (chunk-TLFX46WQ.js?v=aa9e8abc:26015:13)
    at renderRootSync (chunk-TLFX46WQ.js?v=aa9e8abc:25994:15)
    at recoverFromConcurrentError (chunk-TLFX46WQ.js?v=aa9e8abc:25614:28)
    at performSyncWorkOnRoot (chunk-TLFX46WQ.js?v=aa9e8abc:25757:28)
    at flushSyncCallbacks (chunk-TLFX46WQ.js?v=aa9e8abc:15997:30)
    at flushRoot (chunk-TLFX46WQ.js?v=aa9e8abc:25783:15)

leex279 commented 1 month ago

Hm, I don't know about that. Did you also try just using a reverse proxy?

It looks like your server and client are not in sync; the first message says Server: "Ollama" vs. Client: "Anthropic".

nowjon commented 1 month ago

> Hm, I don't know about that. Did you also try just using a reverse proxy?
>
> It looks like your server and client are not in sync; the first message says Server: "Ollama" vs. Client: "Anthropic".

I did try a reverse proxy, from a different server to a public IP with a Let's Encrypt cert, but no dice. It appears I don't see any errors either, though. (screenshot attached)

leex279 commented 1 month ago

Maybe it's a problem with Ollama. Is there also no log on the server where you started the Bolt application?

Did you try OpenAI or other API keys instead? (I think I remember someone mentioning in a YouTube video that there are problems with certain models, so not all of them work.)

nowjon commented 1 month ago

> Maybe it's a problem with Ollama. Is there also no log on the server where you started the Bolt application?
>
> Did you try OpenAI or other API keys instead? (I think I remember someone mentioning in a YouTube video that there are problems with certain models, so not all of them work.)

No errors in the CMD window I started the program from (running on Windows; I can only get the application to run using pnpm run dev). I did not try other providers, as I have no need/want for this unless it goes through Ollama. I believe the issues mentioned were related to smaller models, so I've been trying larger ones, but I only have a 1070 on my Ollama server, so primarily codellama:13b.

leex279 commented 1 month ago

I don't get it to work with Ollama at all because I get cross-origin errors: the client calls the Ollama model directly instead of passing the request to the server to be handled there and sent back. I tried putting Ollama behind the Caddy reverse proxy as well and adding origin headers in the entry.server file, but that did not work :/

Access to fetch at 'https://192.168.1.106:8443/api/tags' from origin 'https://192.168.1.106' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled. _index-B89LtBCH.js:65
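
A quick way to confirm the missing header from the command line (a sketch reusing the endpoint from the error above; -k accepts the self-signed certificate):

```sh
# Dump response headers for the failing endpoint and look for the CORS header.
curl -sk -D - -o /dev/null -H 'Origin: https://192.168.1.106' \
  'https://192.168.1.106:8443/api/tags' \
  | grep -i 'access-control-allow-origin' || echo 'no CORS header in response'
```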

ChatGPT's attempt to fix it: https://chatgpt.com/share/671abd67-4bd8-800d-94af-1b75f01f9930

HINT: Pull the latest fixes from the Git repo. I just saw there were some fixes today.

PixifyAI commented 4 weeks ago

If you get this to work on Ubuntu, let me know. I tried to set up a reverse proxy with nginx but had no luck: CORS errors on creating files and opening the terminal.

leex279 commented 4 weeks ago

@PixifyAI OpenAI works fine for me; it's just my local Ollama that is not working because of CORS. But yes, if I manage to fix it I will report back.

aaronbolton commented 3 weeks ago

Did anyone manage to fix this? I'm running the Bolt production release in a Docker container, accessing Ollama in another container on the same network. I've put Traefik in front of the Bolt container, but it doesn't list the Ollama models, and no errors show up in the Docker logs.

aaronbolton commented 3 weeks ago

I'm unsure if this is related, but I believe I found the CORS issue.

It's not Bolt that is the issue but rather the Ollama config: you need to set the allowed CORS origins to

OLLAMA_ORIGINS=*
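
For anyone landing here later: OLLAMA_ORIGINS is Ollama's allow-list of CORS origins. Two common ways to set it, sketched under assumptions (a systemd-managed Ollama on Linux, or the official ollama/ollama Docker image):

```sh
# Option 1: systemd-managed Ollama on Linux. Create a drop-in and restart:
sudo systemctl edit ollama
#   add under [Service]:
#   Environment="OLLAMA_ORIGINS=*"
sudo systemctl restart ollama

# Option 2: Docker, passing the variable into the container:
docker run -d -e OLLAMA_ORIGINS='*' -p 11434:11434 ollama/ollama

# Note: '*' allows any origin; fine for LAN testing, consider narrowing later.
```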