coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

feat(bolt terminal): added dedicated bolt terminal, and attached to workbench #228

Closed: thecodacus closed this 1 week ago

thecodacus commented 1 week ago

Add Dedicated Terminal for AI Operations

Overview

This PR introduces a dedicated terminal ("Bolt Terminal") for AI operations, replacing the previous approach of using throwaway shell processes. This architectural change improves reliability, provides better process management, and enhances the user experience when executing AI-driven commands.
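To illustrate the architectural idea, here is a minimal sketch of what a dedicated, persistent shell wrapper could look like. The names (`BoltShell`, `execute`) and the queueing behavior are illustrative assumptions, not the PR's actual API, and a mock executor stands in for the real WebContainer-backed process:

```typescript
// Hedged sketch: a persistent "BoltShell"-style wrapper that serializes
// commands through one long-lived session instead of spawning a throwaway
// shell per command. The Executor is a stand-in for the real process.
type Executor = (command: string) => Promise<string>;

class BoltShell {
  private queue: Promise<unknown> = Promise.resolve();
  private ready = false;

  constructor(private run: Executor) {}

  // One-time startup, mirroring attaching a single persistent terminal.
  async init(): Promise<void> {
    this.ready = true;
  }

  // Commands are chained onto the queue so they run one at a time,
  // in order, against the same shell session.
  execute(command: string): Promise<string> {
    if (!this.ready) {
      return Promise.reject(new Error("shell not initialized"));
    }
    const result = this.queue.then(() => this.run(command));
    // Keep the chain alive even if a command fails.
    this.queue = result.catch(() => undefined);
    return result;
  }
}

// Usage with a mock executor that records execution order.
async function demo(): Promise<string[]> {
  const log: string[] = [];
  const shell = new BoltShell(async (cmd) => {
    log.push(cmd);
    return `ran: ${cmd}`;
  });
  await shell.init();
  await Promise.all([
    shell.execute("npm install"),
    shell.execute("npm run dev"),
  ]);
  return log; // commands ran in order: install, then dev
}

demo().then((log) => console.log(log.join(",")));
```

The key design point is that even when two actions are dispatched concurrently, they execute sequentially against one session, which is what makes process management and output tracking reliable compared to fire-and-forget shells.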

Key Changes

1. Terminal Architecture

2. Action System Updates

3. UI Enhancements

4. Process Management

Technical Details

BoltShell Class

The new BoltShell class provides:

Action System Changes

Testing

Migration Impact

Future Improvements

Preview

https://github.com/user-attachments/assets/632514d9-537c-4b75-a69e-cf7f47fc605d

coleam00 commented 1 week ago

This is incredible, thank you so much @thecodacus! We are looking into merging this this weekend!

wonderwhy-er commented 1 week ago

Ooh, that was on my list of things, great addition!

thecodacus commented 1 week ago

Thanks @wonderwhy-er. I can see there are lots of things the bolt.new website has that are missing from this repo, and they are much needed for the experience. @coleam00, I have also raised another PR for code streaming; please check that out as well.

chrismahoney commented 1 week ago

Taking this on to review and merge between tonight and tomorrow. I've already seen some vast improvements locally, plus the simplicity of not having to use dev tools for the console output.

I’ll reach out if there are edge cases or non-local scenarios I’d like to confirm are working. Thanks so much for giving this the time it deserves 🙌

chrismahoney commented 1 week ago

@thecodacus The repo name for this PR changed to dash-ai; could you update the PR? I'm reviewing based off of this same branch there for now.

thecodacus commented 1 week ago

@chrismahoney, I set the name dash-ai from the start when I forked it. I don't know if it will cause any issues here. Shall I rename it?

chrismahoney commented 1 week ago

I do see this occasionally, but I believe it's a function of the general prompt not playing well with my local qwen2.5-coder model. Despite the visual, it performs the codegen/artifact steps. More of an FYI so I can add an issue for this and the Bolt modifications syntax later.

(screenshot attached)

thecodacus commented 1 week ago

That's weird. Maybe the addition of one extra command sometimes confuses the LLM.

chrismahoney commented 1 week ago

Oh okay, probably disregard for now. I'll review, and if there are merge issues they may just be local for me vs. upstream.

veerababumanyam commented 1 week ago

Can someone please advise me on the preview? I tried all the ways mentioned but am not able to get a preview: Windows, Docker, and multiple LLMs such as OpenAI, Anthropic, and Ollama with qwen, phi, deepseek, and llama. I adjusted the max tokens setting as well, but still no preview, and reinstalling didn't help. It would be great if someone could suggest how to get a preview working. Thanks in advance.

chrismahoney commented 1 week ago

If you're running a small model, try to verify that the preview is intact by running a prompt such as 'Create a basic Hello World page using HTML5'. I know it's simple, but if you're having issues related to your model's context length, that prompt fits within the context window and should result in a functional preview.