coleam00 / bolt.new-any-llm

Prompt, run, edit, and deploy full-stack web applications using any LLM you want!
https://bolt.new
MIT License

FileSync: Add option to load files from local filesystem #289

Closed: mrsimpson closed this 6 days ago

mrsimpson commented 1 week ago

What's in here

This PR adds a button to the workbench to load files from the host. It is a step toward keeping local files in sync with the webcontainer inside Bolt.

Screenshot 2024-11-15 at 12:34:41
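For orientation, here is a rough sketch of what loading a picked local directory into the WebContainer can look like. It assumes the browser File System Access API and the `@webcontainer/api` filesystem methods; `loadLocalFiles` is a hypothetical helper name, not the code in this PR:

```ts
// Rough sketch: copy a user-picked local directory into the WebContainer.
// Assumes the File System Access API (Chromium; may need
// @types/wicg-file-system-access for typings) and the @webcontainer/api
// filesystem methods. `loadLocalFiles` is a hypothetical helper name.
import type { WebContainer } from '@webcontainer/api';

export async function loadLocalFiles(webcontainer: WebContainer, targetDir = '/project') {
  // Ask the user to pick a folder on the host (requires a secure context).
  const rootHandle = await window.showDirectoryPicker();
  await copyDirectory(rootHandle, targetDir);

  async function copyDirectory(handle: FileSystemDirectoryHandle, containerPath: string) {
    // Ensure the target directory exists inside the container's virtual FS.
    await webcontainer.fs.mkdir(containerPath, { recursive: true });

    for await (const [name, entry] of handle.entries()) {
      const childPath = `${containerPath}/${name}`;
      if (entry.kind === 'directory') {
        await copyDirectory(entry as FileSystemDirectoryHandle, childPath);
      } else {
        // Read the host file and write its content into the WebContainer.
        const file = await (entry as FileSystemFileHandle).getFile();
        await webcontainer.fs.writeFile(childPath, await file.text());
      }
    }
  }
}
```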

Checklist

Tradeoffs

The initial idea of improving the function further was discarded, since it would increase complexity considerably: what should be done if files have been changed both on the host and within the webcontainer? An explicit load was therefore preferred.

wonderwhy-er commented 1 week ago

I don't think this is the correct way; did you test that it really works? The way Bolt.new works is that it sends messages to the AI, and the AI answers with "files" that are then written into the webcontainer.

If you just add files to the container, the AI will not see that.

The way it needs to be done is to add a message to the chat with the content of the files you are adding.

There is an older PR that does that here: https://github.com/coleam00/bolt.new-any-llm/pull/31

Work on it has slowed down, but I hope we get to merging it.
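To illustrate the flow described above, a minimal sketch of surfacing imported files to the model by appending a chat message. It assumes the `Message` shape of the Vercel AI SDK that Bolt's chat is built on; the message wording and formatting below are made up for the example:

```ts
// Illustration only: surface imported files to the model by appending a chat
// message instead of writing them straight into the WebContainer. Assumes the
// Message shape of the Vercel AI SDK; the message wording/format is made up.
import type { Message } from 'ai';

function buildFileImportMessage(files: Record<string, string>): Message {
  const body = Object.entries(files)
    .map(([path, content]) => `File: ${path}\n\`\`\`\n${content}\n\`\`\``)
    .join('\n\n');

  return {
    id: crypto.randomUUID(),
    role: 'user',
    content: `I imported the following files from my local machine:\n\n${body}`,
  };
}

// With useChat() this could then be passed to append(), e.g.
// append(buildFileImportMessage(loadedFiles));
```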

mrsimpson commented 6 days ago

I tested it by loading files from disk into a new conversation and then asking the AI to describe the project. It did so properly, so I assumed the whole filesystem was submitted to the context. 🤷

If another approach is more appropriate, I'll have a look at it.

chrismahoney commented 6 days ago

I do think the sync behavior needs to be out-of-band of the AI, aside from perhaps an inference step to determine the diff between the local and webcontainer filesystems. I'll check this out as well to get a better idea of what needs to happen for GitHub sync. @wonderwhy-er is also very interested in this functionality as a priority, thanks for submitting this!
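For illustration, an out-of-band comparison along those lines could be as simple as diffing two path-to-content maps, one snapshotted from the local filesystem and one from the WebContainer, without involving the model. The `FileDiff` shape and both maps are hypothetical, not an existing Bolt API:

```ts
// Illustration of an out-of-band comparison: diff two path-to-content maps
// (local snapshot vs. WebContainer snapshot) without involving the model.
interface FileDiff {
  added: string[];
  modified: string[];
  removed: string[];
}

function diffFileMaps(local: Map<string, string>, container: Map<string, string>): FileDiff {
  const diff: FileDiff = { added: [], modified: [], removed: [] };

  for (const [path, content] of local) {
    if (!container.has(path)) diff.added.push(path);
    else if (container.get(path) !== content) diff.modified.push(path);
  }
  for (const path of container.keys()) {
    if (!local.has(path)) diff.removed.push(path);
  }
  return diff;
}
```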

mrsimpson commented 6 days ago

Re-tested it. It didn't work, with the same workflow as earlier. It looks as if some other cache got transmitted when I tested it the first time.

I'll look into how the frontend is propagating the files. Closing in favor of #31; I might reopen this later on if appropriate.

mrsimpson commented 3 days ago

FWIW: I checked the other PRs opened on the subject (#31 and, in particular, #162). I have to admit it's my first real look at the Bolt code, but I'm not a big fan of posting the uploaded files as messages to the chat (as done in #162).

The reason is that

I therefore continued passing the information via the store and updated my branch. Since I don't expect this solution to be welcomed (@wonderwhy-er already said he thinks it should be done via the chat), I'm not re-opening my PR.

However, I want to point out one thing I noticed while implementing this: due to a current... let's call it a bug... where <bolt_file_modifications> are rendered in full, uploading many files causes the diff to become very large. Since it is displayed completely, this made my UI freeze. I do think <bolt_file_modifications> should be rendered differently, but I didn't figure out where that should be done.
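One possible direction for that different rendering, sketched only to illustrate the idea (this is a hypothetical component, not Bolt's actual message renderer): keep the file modifications collapsed and only mount the potentially huge diffs on demand.

```tsx
// Hypothetical component: keep file modifications collapsed and only mount the
// (potentially huge) diffs when the user expands them, so large uploads don't
// freeze the UI.
import { useState } from 'react';

interface FileModification {
  path: string;
  diff: string;
}

export function FileModificationsSummary({ mods }: { mods: FileModification[] }) {
  const [open, setOpen] = useState(false);

  return (
    <div>
      <button onClick={() => setOpen((value) => !value)}>
        {mods.length} file(s) modified
      </button>
      {open &&
        mods.map((mod) => (
          <details key={mod.path}>
            <summary>{mod.path}</summary>
            <pre>{mod.diff}</pre>
          </details>
        ))}
    </div>
  );
}
```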

Anyway: keep up the great work, I'm looking forward to the production-ready solution!

For the record: This is what https://github.com/mrsimpson/bolt.new-any-llm/tree/load-files currently looks like

https://github.com/user-attachments/assets/36b14feb-cca6-4543-bce9-a0be2d28f1b2

wonderwhy-er commented 3 days ago

Hmm, it's a little hard to understand what exactly you mean, as there are a couple of things mixed together.

  1. Adding files only to the WebContainer does not work, as the AI will not know about such changes, so in some way they should be added to the chat.
  2. That does not mean such messages need to be rendered in the UI; it could be some kind of "tool use" message that the user does not need to see. Though if you add files or call tools in ChatGPT or similar, you do see a collapsed element indicating that a file is involved.

I don't really want to see full diffs in the UI chat. But without adding such changes to the chat message state, you are setting the AI up to get confused and overwrite your added or changed files.

mrsimpson commented 3 days ago

Sorry for not being precise/explicit enough, so I'll try to elaborate a bit:

Adding files only to the WebContainer does not work, as the AI will not know about such changes, so in some way they should be added to the chat.

I am well aware of this; you commented it as a first reaction to this very PR ;) So I looked into how Bolt handles file changes, such as modifying a file in the integrated editor. What it does is track them in the files store using #modifiedFiles. Since uploading is similar to manipulating files in the editor, I think this is the right place to hook into. The subsequent mechanism with a "diff message" is also appropriate, since a file might already be present with different content in the container before being overwritten by a file upload.
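To illustrate that hook point, here is a much-simplified model of the files store (the real FilesStore in Bolt has a different API; class and method names below are made up): an upload records the previous container content in #modifiedFiles just like an in-editor save would, so the existing diff-message mechanism can report the change to the model on the next request.

```ts
// Much-simplified, hypothetical model of the files store hook point described
// above; not Bolt's actual FilesStore implementation.
class FilesStoreSketch {
  #files = new Map<string, string>();          // current content per path
  #modifiedFiles = new Map<string, string>();  // original content before modification

  uploadFile(path: string, newContent: string) {
    const previous = this.#files.get(path);

    // Remember the original content once, just as an editor edit would.
    if (previous !== undefined && !this.#modifiedFiles.has(path)) {
      this.#modifiedFiles.set(path, previous);
    }
    this.#files.set(path, newContent);
  }

  getFileModifications() {
    // Later consumed to build the <bolt_file_modifications> diff message.
    return Array.from(this.#modifiedFiles, ([path, original]) => ({
      path,
      original,
      current: this.#files.get(path) ?? '',
    }));
  }
}
```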

You wrote in https://github.com/coleam00/bolt.new-any-llm/pull/31#issuecomment-2479983897 that you think those file modifications should be handled via Chat.client.tsx (this is what I called "posting the uploaded messages to the chat"; by "chat" I meant the Chat component turning those technical messages into assistant-like messages). This is another approach, and I tried to explain above why I don't think it is the best idea.

That does not mean such messages need to be rendered in the UI; it could be some kind of "tool use" message that the user does not need to see.

Absolutely. As said, Bolt also creates a <bolt_file_modifications> technical message when you edit a file in the editor. This is propagated, but it should likewise not be shown in full (maybe rendered differently).

Here's how I made Bolt see the changes via file upload, if you want to have a look.

Edit: if we wanted to go the route with #modifiedFiles and the files store, adding a different rendering for the diff would definitely be the highest priority ;)