GALSEN22 opened this issue 1 week ago (status: Open)
The preview is NOT working for me either.
I ran the development setup with docker compose --profile development up.
I asked Ollama to create a boilerplate app; it worked well with qwen2.5-coder-extra-ctx:7b, a model I created with an extended context limit.
the preview address is https://k03e2io1v3fx9wvj0vr8qd5q58o56n-fkdo--5173--d3acb9e1.local-corp.webcontainer-api.io
I wonder whether this domain simply cannot be reached, or is incorrect.
I'm having the same issue. The page is created, but there's nothing when I run npx serve --single (a simple HTML page). Has anybody gotten this working?
No preview available. I installed Bolt with Docker on a Debian server and was trying to create a login page. I got code, but no preview. Log for this generation:
❯ cat /home/.npm/_logs/2024-11-27T18_02_13_390Z-debug-0.log
0 verbose cli /usr/local/bin/node /usr/local/bin/npm
1 info using npm@10.2.3
2 info using node@v18.20.3
3 timing npm:load:whichnode Completed in 1ms
4 timing config:load:defaults Completed in 2ms
5 timing config:load:file:/usr/local/lib/node_modules/npm/npmrc Completed in 4ms
6 timing config:load:builtin Completed in 4ms
7 timing config:load:cli Completed in 1ms
8 timing config:load:env Completed in 0ms
9 timing config:load:file:/home/project/.npmrc Completed in 5ms
10 timing config:load:project Completed in 37ms
11 timing config:load:file:/home/.npmrc Completed in 18ms
12 timing config:load:user Completed in 18ms
13 timing config:load:file:/usr/local/etc/npmrc Completed in 5ms
14 timing config:load:global Completed in 5ms
15 timing config:load:flatten Completed in 3ms
16 timing config:load:setEnvs Completed in 0ms
17 timing config:load Completed in 74ms
18 timing npm:load:configload Completed in 74ms
19 timing npm:load:mkdirpcache Completed in 0ms
20 timing npm:load:mkdirplogs Completed in 1ms
21 verbose title npm run dev
22 verbose argv "run" "dev"
23 timing npm:load:setTitle Completed in 1ms
24 timing npm:load:display Completed in 1ms
25 verbose logfile logs-max:10 dir:/home/.npm/_logs/2024-11-27T18_02_13_390Z-
26 verbose logfile /home/.npm/_logs/2024-11-27T18_02_13_390Z-debug-0.log
27 timing npm:load:logFile Completed in 13ms
28 timing npm:load:timers Completed in 0ms
29 timing npm:load:configScope Completed in 0ms
30 timing npm:load Completed in 93ms
31 timing command:run Completed in 6ms
32 verbose stack ENOENT: no such file or directory, open '/home/project/package.json'
33 verbose cwd /home/project
34 verbose Linux 5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36
35 verbose node v18.20.3
36 verbose npm v10.2.3
37 error code ENOENT
38 error syscall open
39 error path /home/project/package.json
40 error errno -2
41 error enoent Could not read package.json: Error: ENOENT: no such file or directory, open '/home/project/package.json'
42 error enoent This is related to npm not being able to find a file.
42 error enoent
43 verbose exit -2
44 timing npm Completed in 156ms
45 verbose code -2
46 error A complete log of this run can be found in: /home/.npm/_logs/2024-11-27T18_02_13_390Z-debug-0.log
In the project folder I have only /src and components.json, and no package.json file.
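The ENOENT in the log above means npm never found a package.json: the model wrote the source files but skipped the manifest. One workaround, as a sketch only: hand-write a minimal package.json so npm run dev has something to resolve. The "dev" script and the Vite dependency are assumptions about what the generated project was supposed to use.

```shell
# Recreate a minimal package.json by hand (contents are an assumption
# about the intended Vite setup, not taken from the generated project).
cat > package.json <<'EOF'
{
  "name": "project",
  "private": true,
  "scripts": {
    "dev": "vite"
  },
  "devDependencies": {
    "vite": "^4.5.0"
  }
}
EOF
# npm install   # then install dependencies before running "npm run dev"
```

After this, the earlier "Could not read package.json" error should at least turn into an ordinary install/run cycle.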
Same here. Natively installed on Windows (without Docker). I'm using Chrome (131.0.6778.86).
It might just not have installed the packages. I would love to see the code and terminal view; maybe try something simple like asking it to make an HTML, CSS and JS website.
I tested again. After starting the project I selected qwen2.5-coder:7b (with the extended context parameter) and simply clicked the "Build a todo app in React using Tailwind" link. It looks like the "no preview" error is related to incorrect code generation by the model. I tested with ChatGPT-4o and qwen2.5-coder-14b, and they worked.
@kismonguz please list your environment. (OS, how you are running this fork, how you are running Ollama, and any and all commands you ran after cloning it.)
I'm using ChatGPT-4o on Windows 11, Node v21.6.0, Chrome 131.0.6778.86. I pulled the latest from main, ran the commands below, and selected the HTML example. I see no errors that seem related; it just never runs the terminal commands to generate the preview, or it hides the output completely and fails.
pnpm install && pnpm run dev
Chrome console logs:
enable_copy.js:10 enable copy content js called
enable_copy.js:256 Object
enable_copy.js:291 E.C.P is not enabled, returning
client:495 [vite] connecting...
chunk-4MH333OH.js?v=b3f806f7:28429 Download the React DevTools for a better development experience: https://reactjs.org/link/react-devtools
chunk-OTGPQDCF.js?v=b3f806f7:718 ⚠️ React Router Future Flag Warning: The revalidation behavior after 4xx/5xx action responses is changing in v7. You can use the `v7_skipActionErrorRevalidation` future flag to opt-in early. For more information, see https://reactrouter.com/v6/upgrading/future#v7_skipactionerrorrevalidation.
warnOnce @ chunk-OTGPQDCF.js?v=b3f806f7:718
client:614 [vite] connected.
enable_copy.js:10 enable copy content js called
enable_copy.js:256 Object
enable_copy.js:291 E.C.P is not enabled, returning
enable_copy.js:10 enable copy content js called
enable_copy.js:256 Object
enable_copy.js:291 E.C.P is not enabled, returning
:1234/v1/models:1
Failed to load resource: net::ERR_CONNECTION_REFUSED
constants.ts:411 Error getting LMStudio models: TypeError: Failed to fetch
at Object.getLMStudioModels [as getDynamicModels] (constants.ts:402:28)
at constants.ts:422:24
at Array.map (
Failed to load resource: net::ERR_CONNECTION_REFUSED
constants.ts:411 Error getting LMStudio models: TypeError: Failed to fetch
at Object.getLMStudioModels [as getDynamicModels] (constants.ts:402:28)
at constants.ts:422:24
at Array.map (
…an appropriate as value and it is preloaded intentionally.
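The :1234 errors above come from Bolt probing for a local LM Studio server (getLMStudioModels fetches /v1/models); ERR_CONNECTION_REFUSED just means nothing is listening on LM Studio's default port, which should be harmless unless you actually use the LM Studio provider. A quick probe (assuming curl and the default port 1234):

```shell
# Check whether an LM Studio (OpenAI-compatible) server is listening locally.
# Falls back to a message instead of failing when nothing is on :1234.
models=$(curl -s --max-time 2 http://localhost:1234/v1/models) || models="no LM Studio server on :1234"
echo "$models"
```

If this prints a JSON model list, the fetch errors in the console should disappear.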
chunk-4MH333OH.js?v=b3f806f7:7399 Warning: Function components cannot be given refs. Attempts to access this ref will fail. Did you mean to use React.forwardRef()?
Check the render method of SlotClone.
at _c (http://localhost:5173/app/components/ui/IconButton.tsx:11:3)
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:253:11
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:234:11
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:336:13
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:253:11
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:234:11
at http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:336:13
at http://localhost:5173/node_modules/.vite/deps/chunk-VJ7WFBVI.js?v=b3f806f7:2053:13
at http://localhost:5173/node_modules/.vite/deps/@radix-ui_react-tooltip.js?v=b3f806f7:230:13
at Provider (http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:86:15)
at Provider (http://localhost:5173/node_modules/.vite/deps/chunk-VJ7WFBVI.js?v=b3f806f7:1948:15)
at Popper (http://localhost:5173/node_modules/.vite/deps/chunk-VJ7WFBVI.js?v=b3f806f7:2045:11)
at Tooltip (http://localhost:5173/node_modules/.vite/deps/@radix-ui_react-tooltip.js?v=b3f806f7:139:5)
at WithTooltip (http://localhost:5173/app/components/ui/Tooltip.tsx:10:3)
at ExportChatButton (http://localhost:5173/app/components/chat/chatExportAndImport/ExportChatButton.tsx:12:3)
at ClientOnly (http://localhost:5173/node_modules/.vite/deps/remix-utils_client-only.js?v=b3f806f7:38:23)
at div
at div
at div
at div
at div
at div
at div
at div
at Provider (http://localhost:5173/node_modules/.vite/deps/chunk-PSGSTUXP.js?v=b3f806f7:86:15)
at TooltipProvider (http://localhost:5173/node_modules/.vite/deps/@radix-ui_react-tooltip.js?v=b3f806f7:95:5)
at http://localhost:5173/app/components/chat/BaseChat.tsx:82:3
at http://localhost:5173/app/components/chat/Chat.client.tsx:119:16
at Chat (http://localhost:5173/app/components/chat/Chat.client.tsx:40:7)
at ClientOnly (http://localhost:5173/node_modules/.vite/deps/remix-utils_client-only.js?v=b3f806f7:38:23)
at div
at Index
at RenderedRoute (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:413:5)
at Outlet (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:781:26)
at App
at Layout (http://localhost:5173/app/root.tsx:98:3)
at RenderedRoute (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:413:5)
at RenderErrorBoundary (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:373:5)
at DataRoutes (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:1435:5)
at Router (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:788:15)
at RouterProvider (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:1249:5)
at RemixErrorBoundary (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:2904:5)
at RemixBrowser (http://localhost:5173/node_modules/.vite/deps/chunk-OTGPQDCF.js?v=b3f806f7:4458:46)
printWarning @ chunk-4MH333OH.js?v=b3f806f7:7399
logger.ts:84 DEBUG Terminal Starting bolt terminal [0]
logger.ts:84 DEBUG Terminal Starting bolt terminal [1]
logger.ts:84 DEBUG Terminal Attach [terminal_0]
logger.ts:84 DEBUG Terminal Attach [terminal_1]
blitz.fc837ba8.js:19 [Contextify] [WARNING] running source code in new context
(anonymous) @ blitz.fc837ba8.js:19
blitz.fc837ba8.js:19 [Contextify] [WARNING] running source code in new context
(anonymous) @ blitz.fc837ba8.js:19
7logger.ts:84 DEBUG ActionRunner File written index.html
11logger.ts:84 DEBUG ActionRunner File written styles.css
7logger.ts:84 DEBUG ActionRunner File written script.js
logger.ts:84 DEBUG Chat Finished streaming
logger.ts:84 DEBUG Terminal Starting bolt terminal [0]
logger.ts:84 DEBUG Terminal Starting bolt terminal [1]
logger.ts:84 DEBUG Terminal Starting bolt terminal [0]
logger.ts:84 DEBUG Terminal Starting bolt terminal [1]
headless:1 The resource https://w-corp-staticblitz.com/fetch.worker.fc837ba8.js was preloaded using link preload but not used within a few seconds from the window's load event. Please make sure it has an appropriate as
value and it is preloaded intentionally.
injected.js:42 [USO] Button detection aborting due to timeout.
> @kismonguz please list your environment. (OS, how you are running this fork, how you are running Ollama, and any and all commands you ran after cloning it.)
Here are some details about my test. I just ran it again. I made two tests, with the 7b and 14b models. Details below.
Environment
Test 1 model: quen2.5-extra:7b, made from qwen2.5-coder:7b based on this article: https://coleam00.github.io/bolt.new-any-llm/#super-important-note-on-running-ollama-models
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
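For reference, a Modelfile like the one above is turned into a runnable model with ollama create (a sketch; the tag matches the name used in this comment, but any tag works):

```shell
# Write the extended-context Modelfile and register it with Ollama.
cat > Modelfile <<'EOF'
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
EOF
# ollama create quen2.5-extra:7b -f Modelfile   # requires a running Ollama install
```

The larger num_ctx matters because Bolt's system prompt is long; with the 2048-token default the model loses instructions and generates broken projects.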
What I did:
Results: test1_source.zip test1_chat.json
Test 2 model: quen2.5-extra:14b, made from qwen2.5-coder:14b based on this article: https://coleam00.github.io/bolt.new-any-llm/#super-important-note-on-running-ollama-models
FROM qwen2.5-coder:14b
PARAMETER num_ctx 32768
What I did:
Results: The result is totally different. At the first step of the generation, the install process got stuck because the command is not silent. I manually proceeded through it with the following answers (I think the AI would have chosen these options): Select a framework: › React; Select a variant: › TypeScript + SWC. If I'm not fast enough, I also have to manually run the "npm install" and "npm run dev" commands, but in the end the todo app works.
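Those interactive prompts come from the Vite scaffolder, and they can be skipped by passing the template on the command line, which would keep the install from hanging in Bolt's terminal. A sketch (the react-swc-ts template name is inferred from the React / TypeScript + SWC answers above):

```shell
# Non-interactive scaffold: create-vite accepts --template, so no prompt
# blocks the run. "todo-app" and the template choice are assumptions.
scaffold_cmd="npm create vite@latest todo-app -- --template react-swc-ts"
# eval "$scaffold_cmd" && cd todo-app && npm install && npm run dev
echo "$scaffold_cmd"
```

The actual run is commented out here since it needs network access and an interactive shell; the point is only that a template flag removes the prompts.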
test2_chat.json test2_source.zip
Final thoughts: Unfortunately I'm a Vue guy, so I don't know the proper React starter setup exactly, but I think there is some problem with the generated code.
Hope it helps.
Can you please try multiple browsers, with and without private mode?
I tried. Here are my results:
@kismonguz thanks for testing, this will help.
You're not using any addons in your browsers, are you?
You didn't change your privacy settings in your OS or browsers, did you?
Can you do a Discord conversation so you can screen share and try a few things? I would love to tackle this bug.
Looks like we are getting somewhere, with the calls now completing at least.
In Chrome I use some plugins, but in incognito mode or in the other browsers I do not use anything. I think my OS privacy settings are mostly the defaults.
I'm OK with a Discord conversation. Is there a public server?
Here i made one real fast: https://discord.gg/adXUP4hw
Tiny update: one step forward, two steps backward.
I switched from Ollama to LM Studio; nothing else changed. It increased the overall performance with bigger models, and now I can access some tweak options inside LM Studio.
On the other hand, in the Bolt app I constantly get an error when executing the "npm run dev" command, and it starts consuming all the memory of the universe (though it stopped after reaching the limit of my PC). I think it is a different kind of error than the original one.
❯ npm run dev
> todo-app@0.0.0 dev
> vite
VITE v4.5.5 ready in 2430 ms
➜ Local: http://localhost:5173/
➜ Network: use --host to expose
➜ press h to show help
Error: module is not defined
at file:///home/project/postcss.config.js
at async ModuleLoader.import (https://k03e2io1v3fx9wvj0vr8qd5q58o56n-fkdo.w-corp-staticblitz.com/builtins.ddb8d84d.js:154:2688)
at async importDefault (file:///home/project/node_modules/vite/dist/node/chunks/dep-b2890f90.js:36608:18)
at async Object.search (file:///home/project/node_modules/vite/dist/node/chunks/dep-b2890f90.js:29044:38) {
name: 'Failed to load PostCSS config'
}
Node.js v18.20.3
~/project 3s
❯ fatal error: too many writes on closed pipe
goroutine 6 [running]:
runtime.throw({0x9deaa, 0x1e})
runtime/panic.go:1047 +0x3 fp=0x83aed8 sp=0x83aeb0 pc=0x12250003
os.sigpipe()
runtime/os_js.go:144 +0x2 fp=0x83aef0 sp=0x83aed8 pc=0x13b70002
os.epipecheck(...)
os/file_unix.go:224
os.(*File).Write(0x80c020, {0x86c240, 0x2d, 0x40})
os/file.go:183 +0x2d fp=0x83af78 sp=0x83aef0 pc=0x1607002d
main.runService.func1()
github.com/evanw/esbuild/cmd/esbuild/service.go:99 +0x7 fp=0x83afe0 sp=0x83af78 pc=0x1f630007
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x83afe8 sp=0x83afe0 pc=0x14070001
created by main.runService
github.com/evanw/esbuild/cmd/esbuild/service.go:97 +0x1e
goroutine 1 [chan receive]:
runtime.gopark(0xb66b0, 0x834178, 0xe, 0x17, 0x2)
runtime/proc.go:381 +0x28 fp=0x897bc8 sp=0x897ba0 pc=0x124c0028
runtime.chanrecv(0x834120, 0x0, 0x1)
runtime/chan.go:583 +0x7f fp=0x897c50 sp=0x897bc8 pc=0x106d007f
runtime.chanrecv1(0x834120, 0x0)
runtime/chan.go:442 +0x2 fp=0x897c78 sp=0x897c50 pc=0x106b0002
github.com/evanw/esbuild/internal/helpers.(*ThreadSafeWaitGroup).Wait(...)
github.com/evanw/esbuild/internal/helpers/waitgroup.go:36
main.runService.func2()
github.com/evanw/esbuild/cmd/esbuild/service.go:114 +0x8 fp=0x897c98 sp=0x897c78 pc=0x1f620008
main.runService(0x1)
github.com/evanw/esbuild/cmd/esbuild/service.go:160 +0x5c fp=0x897de0 sp=0x897c98 pc=0x1f60005c
main.main()
github.com/evanw/esbuild/cmd/esbuild/main.go:240 +0x9e fp=0x897f88 sp=0x897de0 pc=0x1f59009e
runtime.main()
runtime/proc.go:250 +0x32 fp=0x897fe0 sp=0x897f88 pc=0x12460032
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x897fe8 sp=0x897fe0 pc=0x14070001
goroutine 2 [force gc (idle)]:
runtime.gopark(0xb6848, 0x3c65d0, 0x11, 0x14, 0x1)
runtime/proc.go:381 +0x28 fp=0x828fb8 sp=0x828f90 pc=0x124c0028
runtime.goparkunlock(...)
runtime/proc.go:387
runtime.forcegchelper()
runtime/proc.go:305 +0x1f fp=0x828fe0 sp=0x828fb8 pc=0x1249001f
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x828fe8 sp=0x828fe0 pc=0x14070001
created by runtime.init.5
runtime/proc.go:293 +0x2
goroutine 3 [GC sweep wait]:
runtime.gopark(0xb6848, 0x3c6960, 0xc, 0x14, 0x1)
runtime/proc.go:381 +0x28 fp=0x829798 sp=0x829770 pc=0x124c0028
runtime.goparkunlock(...)
runtime/proc.go:387
runtime.bgsweep(0x82e000)
runtime/mgcsweep.go:278 +0xf fp=0x8297d0 sp=0x829798 pc=0x1179000f
runtime.gcenable.func1()
runtime/mgc.go:178 +0x2 fp=0x8297e0 sp=0x8297d0 pc=0x110d0002
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x8297e8 sp=0x8297e0 pc=0x14070001
created by runtime.gcenable
runtime/mgc.go:178 +0x8
goroutine 4 [GC scavenge wait]:
runtime.gopark(0xb6848, 0x3c6ba0, 0xd, 0x14, 0x2)
runtime/proc.go:381 +0x28 fp=0x829f80 sp=0x829f58 pc=0x124c0028
runtime.goparkunlock(...)
runtime/proc.go:387
runtime.(*scavengerState).park(0x3c6ba0)
runtime/mgcscavenge.go:400 +0xd fp=0x829fa8 sp=0x829f80 pc=0x1160000d
runtime.bgscavenge(0x82e000)
runtime/mgcscavenge.go:628 +0x4 fp=0x829fd0 sp=0x829fa8 pc=0x11650004
runtime.gcenable.func2()
runtime/mgc.go:179 +0x2 fp=0x829fe0 sp=0x829fd0 pc=0x110c0002
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x829fe8 sp=0x829fe0 pc=0x14070001
created by runtime.gcenable
runtime/mgc.go:179 +0xe
goroutine 5 [finalizer wait]:
runtime.gopark(0xb66e8, 0x3e3850, 0x10, 0x14, 0x1)
runtime/proc.go:381 +0x28 fp=0x828738 sp=0x828710 pc=0x124c0028
runtime.runfinq()
runtime/mfinal.go:193 +0x1f fp=0x8287e0 sp=0x828738 pc=0x1104001f
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x8287e8 sp=0x8287e0 pc=0x14070001
created by runtime.createfing
runtime/mfinal.go:163 +0xd
goroutine 7 [waiting]:
runtime.gopark(0x0, 0x0, 0x0, 0x0, 0x1)
runtime/proc.go:381 +0x28 fp=0x82af90 sp=0x82af68 pc=0x124c0028
runtime.handleEvent()
runtime/lock_js.go:257 +0x1b fp=0x82afe0 sp=0x82af90 pc=0x10a3001b
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x82afe8 sp=0x82afe0 pc=0x14070001
created by runtime.beforeIdle
runtime/lock_js.go:207 +0x1a
goroutine 8 [sleep]:
runtime.gopark(0xb6888, 0x832190, 0x13, 0x13, 0x1)
runtime/proc.go:381 +0x28 fp=0x82b768 sp=0x82b740 pc=0x124c0028
time.Sleep(0x3b9aca00)
runtime/time.go:195 +0x1b fp=0x82b7a0 sp=0x82b768 pc=0x13d6001b
main.runService.func3()
github.com/evanw/esbuild/cmd/esbuild/service.go:124 +0x5 fp=0x82b7e0 sp=0x82b7a0 pc=0x1f610005
runtime.goexit()
runtime/asm_wasm.s:399 +0x1 fp=0x82b7e8 sp=0x82b7e0 pc=0x14070001
created by main.runService
github.com/evanw/esbuild/cmd/esbuild/service.go:122 +0x41
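A guess at the "Failed to load PostCSS config / module is not defined" error further up: Vite loads postcss.config.js as an ES module when package.json declares "type": "module", so a CommonJS module.exports there throws exactly this. A sketch of the fix (the Tailwind/Autoprefixer plugin list is an assumption about what the generated config contained):

```shell
# Rewrite postcss.config.js in ESM syntax so "module is not defined" goes away.
# The plugin list below is an assumption about the generated Tailwind setup.
cat > postcss.config.js <<'EOF'
export default {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
  },
};
EOF
# Alternative: keep CommonJS syntax but rename the file so Node treats it as CJS:
# mv postcss.config.js postcss.config.cjs
```

The esbuild "too many writes on closed pipe" panic afterwards looks like a secondary crash of the dev server inside the WebContainer, not the root cause.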
Describe the bug
Hi,
I have an issue: when I start a project, after a certain time I no longer have the preview, and the only message I'm seeing is "no preview available".
Link to the Bolt URL that caused the error
localhost:
Steps to reproduce
I just issued new instructions for my project.
Expected behavior
Show the preview
Screen Recording / Screenshot
Platform
Additional context
No response