Closed · thecodacus closed this 1 week ago
This is incredible, thank you so much @thecodacus! We are looking into merging it this weekend!
Ooh, that was on my list of things. Great addition!
Thanks @wonderwhy-er, I can see there are lots of things the bolt.new website has that are missing from the repo. These are much-needed improvements in terms of experience. @coleam00 I have also raised another PR for code streaming; please check that out as well.
Taking this on to review & merge between tonight and tomorrow; I've already seen some vast improvements locally, and I appreciate the simplicity of not having to use dev tools for the console output.
I’ll reach out if there are edge cases or non-local scenarios I’d like to confirm are working. Thanks so much for giving this the time it deserves 🙌
@thecodacus The repo name for this PR changed to dash-ai; could you update the PR? I'm reviewing based on this same branch there for now.
@chrismahoney, I set the name dash-ai from the start when I forked it. I don't know if it will cause any issues here. Shall I rename it?
I do see this occasionally, but I believe it's a function of the general prompt not playing well with my local qwen2.5-coder model. Despite the visual, it still performs the codegen/artifact steps. This is more of an FYI so I can add an issue for this and the Bolt modifications syntax later.
That's weird; maybe the addition of one extra command is sometimes confusing the LLM.
Oh okay, probably disregard for now. I'll review, and any merge issues may just be local for me vs. upstream.
Can someone please advise me on getting the preview to work? I tried all the approaches mentioned but am not able to get a preview: Windows, Docker, and multiple LLMs like OpenAI, Anthropic, and Ollama with qwen, phi, deepseek, and llama. I adjusted the max tokens setting as well, but still no preview, and reinstalling didn't help either. It would be great if someone could suggest how to get a preview. Thanks in advance.
If you’re running a small model, try to verify that preview is intact by running a prompt such as ‘Create a basic Hello World page using HTML5’. I know it’s simple, but if you’re having issues related to the context length of your model, that prompt fits within the context window and should result in a functional preview.
Add Dedicated Terminal for AI Operations
Overview
This PR introduces a dedicated terminal ("Bolt Terminal") for AI operations, replacing the previous approach of using throwaway shell processes. This architectural change improves reliability, provides better process management, and enhances the user experience when executing AI-driven commands.
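For orientation, here is a rough TypeScript sketch of the pattern being replaced, assuming the WebContainer API that bolt.new is built on; the function name and exact call sites below are illustrative, not the repo's actual code:

```ts
import type { WebContainer } from '@webcontainer/api';

// Previous pattern (roughly): every AI-driven command spawned its own
// short-lived shell process, which was discarded as soon as it exited.
async function runInThrowawayShell(webcontainer: WebContainer, command: string, args: string[]) {
  const proc = await webcontainer.spawn(command, args);

  // Surface output somewhere (previously this meant digging through dev tools).
  proc.output.pipeTo(new WritableStream<string>({ write: (chunk) => console.log(chunk) }));

  // The process dies here; no session state survives to the next command.
  return proc.exit;
}
```

The dedicated Bolt Terminal keeps one such process alive instead (see the `BoltShell` sketch under Technical Details below).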
Key Changes
1. Terminal Architecture
- `BoltShell` class to manage a persistent terminal session

2. Action System Updates
- `start` action type for application startup operations
- `ActionRunner` updated to use the dedicated Bolt terminal (see the sketch below)
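A minimal sketch of how this dispatch might look, assuming a discriminated-union action type; the names (`BoltAction`, `executeCommand`, the `file` case) are placeholders rather than the PR's exact definitions:

```ts
// Hypothetical action shapes; the real definitions in the repo may differ.
type ShellAction = { type: 'shell'; content: string };
type StartAction = { type: 'start'; content: string }; // e.g. "npm run dev"
type FileAction = { type: 'file'; filePath: string; content: string };
type BoltAction = ShellAction | StartAction | FileAction;

class ActionRunner {
  constructor(private boltShell: { executeCommand(command: string): Promise<void> }) {}

  async runAction(action: BoltAction): Promise<void> {
    switch (action.type) {
      case 'shell':
      case 'start':
        // Both now run inside the single persistent Bolt terminal
        // instead of a throwaway process per command.
        return this.boltShell.executeCommand(action.content);
      case 'file':
        // File writes are handled elsewhere; omitted in this sketch.
        return;
    }
  }
}
```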
3. UI Enhancements

4. Process Management
Technical Details
BoltShell Class
The new `BoltShell` class provides a single persistent shell session that AI-driven commands share, rather than spawning a throwaway process per command.
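A minimal sketch of the idea, assuming the WebContainer API and the `jsh` shell it ships with; the method names, output wiring, and terminal size here are assumptions, not the PR's exact implementation:

```ts
import type { WebContainer, WebContainerProcess } from '@webcontainer/api';

// Hypothetical sketch of a persistent terminal session for AI operations.
class BoltShell {
  #process?: WebContainerProcess;
  #input?: WritableStreamDefaultWriter<string>;

  // Spawn the shell once and keep it alive for the lifetime of the workbench.
  async init(webcontainer: WebContainer): Promise<void> {
    this.#process = await webcontainer.spawn('jsh', {
      terminal: { cols: 80, rows: 24 },
    });
    this.#input = this.#process.input.getWriter();

    // Mirror everything the shell prints into the visible Bolt Terminal tab.
    this.#process.output.pipeTo(
      new WritableStream<string>({
        write: (chunk) => console.log(chunk), // the real UI would feed an xterm.js instance
      }),
    );
  }

  // Every AI-driven command is written into the same long-lived session,
  // so environment, cwd, and running dev servers persist across actions.
  async executeCommand(command: string): Promise<void> {
    if (!this.#input) throw new Error('BoltShell is not initialized');
    await this.#input.write(`${command}\n`);
  }
}
```

Because the session persists, a `start` action can launch a dev server and leave it running while later shell actions execute in the same environment.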
Action System Changes
- `start` action type specifically for development server operations

Testing
Migration Impact
Future Improvements
Preview
https://github.com/user-attachments/assets/632514d9-537c-4b75-a69e-cf7f47fc605d