akshaynmhc opened this issue 3 weeks ago
What model are you using?
I tried using an OPEN_AI_LIKE model and the preview does not show either.
Are there specific model requirements to get the preview to work?
Hi ArulGandhi, I am using Llama 3.1 8b (Groq)
I saw some videos on YouTube; most of them say the preview is buggy in the self-hosted version of bolt.new.
I also can't seem to get the preview to work with Ollama models. I tried deepseek-coder-v2 and it shows code in the chat like a normal ChatGPT response, but it doesn't create the preview on the right and doesn't even create the files. Is this an Ollama-specific issue, or something else?
I've had issues on and off getting the preview to operate properly, but it seemed to relate to getting any server implementation to return dynamic content rather than simply rendering an HTML/client component/page. It's possible there's a prompt-engineering problem here, in that a Node-based script is not being launched before the preview is viewed. It would be very helpful to simplify the test case down to a minimum viable example, i.e. basic HTML page generation, a simple Express server launch, etc. (see the sketch below).
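As a rough sketch of what that minimal test could look like (this is just an illustration, not something bolt is guaranteed to generate; Express and the port/route are my own assumptions):

```js
// Minimal Express server to use as a stripped-down preview test.
// Run with: node server.js (after npm install express)
const express = require('express');
const app = express();

// Serve dynamic content so the preview actually has to run the Node
// process rather than just render a static HTML file.
app.get('/', (req, res) => {
  res.send(`<h1>Preview test</h1><p>Server time: ${new Date().toISOString()}</p>`);
});

app.listen(3000, () => {
  console.log('Test server listening on http://localhost:3000');
});
```

If the preview can't even start something like this, the problem is likely in the container/launch step rather than in the model's output.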
remove "type": "module", line in package.json and save the file and run the npx commands, you should see the preview
Same issue here, with Chrome v130.0.6723.92 and the Mistral Large model. I tried several different prompts for building React applications; all of them show "No Preview Available" in the preview tab.
The preview does seem to take a bit longer to load, but Ollama and DeepSeek both work fine on my end, ngl. I'd recommend upgrading to Chrome 131 to fix a Vite bug; beyond that, the times I've had issues with the WebContainer have always seemed to be the model's fault: not understanding the prompt, random hallucination issues, and the like. I don't think there's a problem with the app. Try changing the model; if it works, try again with the model you were having trouble with, and if it still doesn't work, it's the model's fault. 👍
I can't test Groq since I don't have an API key, so it might also be a Groq model problem. (I've seen model behavior differ from provider to provider.) Comparing DeepSeek Chat vs. OpenRouter, I got vastly different responses when asking which is greater, 9.11 or 9.9; DeepSeek Chat was correct.
I may be wrong, feel free to correct me.
Describe the bug
⚠️ Preview doesn't work, and there is a container issue when running code in the online IDE.
My prompt: Build a YouTube clone with a modern UI using React and Tailwind CSS. Include core features like:
Please use the YouTube API for real data. Focus on clean code and performance.
Response I got:
The preview is not working for any app that's built.
I tested the same prompt with the bolt.new live version and it works smoothly. I'm not sure if I'm doing something wrong or if this version is glitched.
Thank you
Link to the Bolt URL that caused the error
http://localhost:5173/chat/youtube-clone
Steps to reproduce
Just use this same prompt:
My prompt: Build a YouTube clone with a modern UI using React and Tailwind CSS. Include core features like:
Please use the YouTube API for real data. Focus on clean code and performance.
Expected behavior
The preview is not working for any app that's built.
Screen Recording / Screenshot
No response
Platform
Additional context
No response