stackblitz / bolt.new

Prompt, run, edit, and deploy full-stack web applications
https://bolt.new
MIT License

PROMPT TOO LONG? WHAT DOES THAT MEAN? #1227

Closed · samora254 closed this 8 hours ago

samora254 commented 1 day ago

Describe the bug

I get an error when issuing a prompt. The error says, "prompt is too long: 202135 tokens > 200000 maximum".

Link to the Bolt URL that caused the error

https://bolt.new/~/sb1-969yan

Steps to reproduce

I am giving normal commands for changes to my app, but the commands are not accepted. I've checked that I still have enough tokens, and my prompts are short paragraphs.

Expected behavior

The requested changes are applied.

Screen Recording / Screenshot

No response

Platform

Browser name = Chrome
Full version = 130.0.0.0
Major version = 130
navigator.appName = Netscape
navigator.userAgent = Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36
performance.memory = {
  "totalJSHeapSize": 173556632,
  "usedJSHeapSize": 164409892,
  "jsHeapSizeLimit": 4294705152
}
Username = samora254
Chat ID = e36485e1dab6

Additional context

No response

ayalpani commented 17 hours ago

It probably means that you have hit Bolt's limits with regard to project size. I guess Bolt tries to send all of your source code into the LLM's context window, but if your project already contains too much source code, the LLM rejects the prompt.
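A minimal sketch of that theory, assuming Bolt concatenates the project's source files into the prompt. The ~4 characters-per-token heuristic is a rough stand-in for the real tokenizer, and `collectSource` is illustrative, not Bolt's actual code:

```ts
// Sketch: why a large project can overflow the model's context window.
import { readFileSync, readdirSync, statSync } from "node:fs";
import { join } from "node:path";

const MAX_TOKENS = 200_000; // limit quoted in the error message

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // crude chars-to-tokens heuristic
}

function collectSource(dir: string): string {
  let combined = "";
  for (const entry of readdirSync(dir)) {
    const path = join(dir, entry);
    if (statSync(path).isDirectory()) {
      if (entry === "node_modules" || entry === ".git") continue;
      combined += collectSource(path);
    } else {
      combined += readFileSync(path, "utf8");
    }
  }
  return combined;
}

const projectTokens = estimateTokens(collectSource("./src"));
const promptTokens =
  projectTokens + estimateTokens("Change the button color to red.");

if (promptTokens > MAX_TOKENS) {
  // Mirrors the reported error: the short user message is not the problem,
  // the project source shipped alongside it is.
  console.error(`prompt is too long: ${promptTokens} tokens > ${MAX_TOKENS} maximum`);
}
```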

samora254 commented 12 hours ago

Makes sense. And it explains why I was not using many tokens earlier, but once they reached 10 million they got exhausted VERY fast.


ayalpani commented 9 hours ago

Exactly. Even when you ask Bolt to just change a constant value somewhere, it will cost you those 200k input tokens plus additional output tokens (3-4 times more costly than input tokens). Somewhere on Twitter the Bolt team explained that they want to give developers the ability to restrict the context to code parts you choose manually.
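Back-of-the-envelope, that pricing asymmetry looks something like this. The dollar rates below are illustrative placeholders, not Bolt's or Anthropic's actual prices; only the "output costs 3-4x input" ratio comes from the comment above:

```ts
// Hypothetical cost sketch for a one-line edit when the whole project is resent.
const INPUT_PRICE_PER_MTOK = 3;                          // assumed $/million input tokens
const OUTPUT_PRICE_PER_MTOK = 4 * INPUT_PRICE_PER_MTOK;  // "3-4x more costly" than input

const inputTokens = 200_000; // entire project shipped as context
const outputTokens = 2_000;  // small diff coming back

const cost =
  (inputTokens / 1_000_000) * INPUT_PRICE_PER_MTOK +
  (outputTokens / 1_000_000) * OUTPUT_PRICE_PER_MTOK;

console.log(`Estimated cost of one tiny edit: $${cost.toFixed(3)}`);
// ≈ $0.624: almost all of it is re-reading unchanged code, not generating the fix.
```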

But that would also mean that, as a Bolt user, you'd have to make a lot of decisions and have an actual understanding of the code Bolt created. It would no longer be a tool that product managers and ordinary non-coders can work with. Even an experienced developer might have a hard time understanding the huge amounts of code Bolt generates.

Alternatively, Bolt could find a good strategy for automatically choosing which parts of your code to fit into the context window. I guess that unless we get much bigger context windows and cheaper prices, this cannot scale beyond a point, and we'll have to take control of the code ourselves. Also, increasing the context window tends to degrade the quality of the LLM's answers, so there may be a natural limitation. Maybe the next generation of LLMs won't have such limitations anymore.
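One plausible shape for that "automatic choosing" strategy is to rank files by relevance to the request and greedily pack them into a token budget. Everything here, including the toy `scoreRelevance` heuristic, is a speculative illustration, not Bolt's actual approach:

```ts
// Speculative sketch: select the most relevant files that fit a token budget.
interface SourceFile {
  path: string;
  content: string;
}

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // same crude heuristic as above
}

function scoreRelevance(file: SourceFile, request: string): number {
  // Toy heuristic: count request words that appear in the file.
  const words = request.toLowerCase().split(/\W+/).filter(Boolean);
  const body = file.content.toLowerCase();
  return words.filter((w) => body.includes(w)).length;
}

function selectContext(
  files: SourceFile[],
  request: string,
  budget: number,
): SourceFile[] {
  const ranked = [...files].sort(
    (a, b) => scoreRelevance(b, request) - scoreRelevance(a, request),
  );
  const chosen: SourceFile[] = [];
  let used = 0;
  for (const file of ranked) {
    const cost = estimateTokens(file.content);
    if (used + cost > budget) continue; // skip files that would blow the budget
    chosen.push(file);
    used += cost;
  }
  return chosen;
}
```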

Anyway, for a huge number of use cases, Bolt's approach already works just fine today: small homepages, small tools, scripts, npm packages, and so on.

Let's see where things go; more and more products like Bolt will appear, each with different strengths.

kc0tlh commented 8 hours ago

@samora254 thanks for your report, and we appreciate your patience as we are a small team working to support all of the new users! The context on this error, workaround ideas, and future updates on the R&D we are doing are being tracked in #1322 going forward, so please subscribe there!

@ayalpani is dead on with everything in that message! We are working to manage the context window automatically, potentially by implementing a multi-agent approach: multiple AIs, each with a 200k context window, working together to help you manage different aspects of the project. This is all R&D at this point, but very exciting.
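Reading between the lines, such a multi-agent setup might look roughly like the following. Every name here (`Agent`, `callModel`, the routing heuristic) is a hypothetical illustration of the idea, not StackBlitz's actual design:

```ts
// Purely speculative sketch: each agent owns a slice of the project and its own
// context budget; an orchestrator routes each request to the most relevant agent.
interface OwnedFile {
  path: string;
  content: string;
}

interface Agent {
  name: string;
  files: OwnedFile[];
  budgetTokens: number; // e.g. 200_000 per agent, as in the comment above
}

function relevance(files: OwnedFile[], request: string): number {
  // Toy heuristic: count request words that appear anywhere in the slice.
  const words = request.toLowerCase().split(/\W+/).filter(Boolean);
  const body = files.map((f) => f.content.toLowerCase()).join("\n");
  return words.filter((w) => body.includes(w)).length;
}

async function callModel(prompt: string): Promise<string> {
  // Placeholder for a real LLM API call.
  return `model response to ${prompt.length} chars of prompt`;
}

async function route(agents: Agent[], request: string): Promise<string> {
  // Pick the agent whose files look most relevant and send only its slice,
  // keeping each individual call within a single 200k context window.
  const best = agents.reduce((a, b) =>
    relevance(a.files, request) >= relevance(b.files, request) ? a : b,
  );
  const context = best.files
    .map((f) => `// ${f.path}\n${f.content}`)
    .join("\n");
  return callModel(`${context}\n\nUser request: ${request}`);
}
```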