Open LuD1161 opened 1 week ago
For some reason the app gets generated in Chrome, but the last command, npm run dev, still shows as running.
Any ideas?
Also, the preview looks very bad with the DeepSeek Coder API. Is this how it is supposed to be 🤔?
I have been struggling with this for the last 24 hours, also using DeepSeek and also Anthropic. I have tried running with and without Docker. What I am finding is that when I give an instruction like "Create an Astro site", it can't complete it.
Using the terminal and running npm create astro@latest, I get this:
project initializing... ▶ Template copying... □ TypeScript □ Dependencies
▲ error Failed to fetch https://api.github.com/repos/withastro/astro/tarball/examples/basics
▲ error TypeError: fetch failed
▲ error SocketError: other side closed
▲ error Unable to download template basics
npm ERR! code 1
npm ERR! path /home/project
npm ERR! command failed
npm ERR! command sh -c create-astro
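The "SocketError: other side closed" lines above point at a network-level failure fetching the template tarball, not at the model. One way to narrow it down (a sketch, assuming curl is installed on the host) is to request the same tarball URL from outside the container; if it succeeds on the host but fails inside, the WebContainer's network access is the likely culprit:

```shell
# Fetch only the response headers for the template tarball the error
# mentions, following redirects; this tells us whether the endpoint is
# reachable at all from this machine.
curl -sIL https://api.github.com/repos/withastro/astro/tarball/examples/basics | head -n 1
```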
I am running into the same issue, using the DeepSeek API. I have tried three different browsers: Firefox, Chrome Canary, and Safari. It generates the code, then gets stuck on "Run Command", which keeps spinning forever, displaying only a white page in the preview window. Has anyone been able to get past it? Not sure what I am doing wrong.
I use Linux and I don't have any problems, but coding is slow (it takes at least a minute), and for some reason it only uses the deepseek-chat model, not deepseek-coder, even when I select Coder.
I wasted almost 3 days on the same issues. I tried both IE and Chrome, but no help. I also tried different LLMs: Anthropic, OpenAI, DeepSeek Coder, Qwen Coder. I tried both the Docker version and the Windows version, but no help. I did all the settings, including context length for all local models, but still no help. I formatted my entire Windows install and did a clean install of Ollama and bolt.new, but still no help. I doubt this is really working for anyone. If it is, tell me exactly what is required to make it work. I did everything in the GitHub repo, watched the videos, and followed every step, but no help.
I'm having the same issue, but so far it's because the model half applies actions and half just responds to messages. 90% of the time with DeepSeek it doesn't even touch the codebase. In my case, dependencies aren't being installed, and when I install them manually they get deleted again. Because the commands aren't being executed, I'm getting ports opened that won't close unless I close bolt and kill the processes manually. With xAI I can get one or two commands in before it hangs; nothing at all with DeepSeek.
A workaround fix for me has been to tell the LLM specifically not to use ANY persistent commands (npm start, node for a server, nodemon, etc.) and that I would start them myself. That finally allowed me to see an updated preview. Hope this works for someone!
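The two steps in that workaround (free the port a hung dev server is still holding, then start the server yourself) can be sketched roughly like this. The use of lsof and port 5173 (Vite's default, which the dev server here uses) are assumptions:

```shell
# free_port: kill whatever process is still holding the given TCP port,
# e.g. a dev server that bolt started but never stopped.
free_port() {
  pids=$(lsof -ti tcp:"$1" 2>/dev/null || true)
  if [ -n "$pids" ]; then
    kill $pids
  fi
  return 0
}

free_port 5173   # port left open by the hung "npm run dev"
# then start the dev server yourself instead of letting the LLM run it:
# npm run dev
```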
I've been trying to get this preview to work for the past two weeks. I initially thought it was an error, as it occasionally displayed errors due to web container limitations. However, I just tried the same prompt on the main Bolt.new website, and the preview opens correctly there. So, something isn’t working properly with the preview feature here—I definitely agree.
I just tried one basic prompt and got this:
Sure, I'll create a simple HTML page that displays "Hello, World!".
Simple HTML Page [Click to open Workbench]
Create index.html
You can now open the index.html file in your browser to see the "Hello, World!" message.
Yeah, it doesn't work for me unless the project is simple HTML, no Tailwind or anything.
Describe the bug
Hello team,
Thanks for the awesome tool. I have paid access to the DeepSeek API and I am trying to build my app there. However, I am not able to see any preview yet. The files are generated, so context size isn't the issue.
Link to the Bolt URL that caused the error
http://localhost:5173/chat/todo-app-react-tailwind
Steps to reproduce
Expected behavior
The preview should be generated.
Screen Recording / Screenshot
Platform
Additional context
No response