It probably means that you have hit Bolt's limits on project size. I guess Bolt tries to send all of your source code into the LLM's context window, and once your project contains too much source code, the request exceeds the limit and gets rejected.
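A rough sketch of that failure mode (purely illustrative; the file map, the ~4-characters-per-token heuristic, and the limit constant are my assumptions, not Bolt's actual code):

```ts
// Illustrative only -- NOT Bolt's real implementation. Assumes the tool
// concatenates every project file into the prompt and that one token is
// roughly 4 characters.
const CONTEXT_LIMIT = 200_000; // the 200k-token window from the error message

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // crude chars-per-token heuristic
}

function promptFits(files: Map<string, string>, userPrompt: string): boolean {
  let total = estimateTokens(userPrompt);
  for (const source of files.values()) {
    total += estimateTokens(source);
  }
  // Once the project alone exceeds the window, every request fails,
  // no matter how short the user's prompt is.
  return total <= CONTEXT_LIMIT;
}
```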
Makes sense. And it explains why I was not using many tokens earlier, but once they reached 10 million they got exhausted VERY fast.
Exactly. Even when you ask Bolt to just change a constant value somewhere, it will cost you those 200k input tokens plus additional output tokens (which are 3-4 times more expensive than input tokens). Somewhere on Twitter the Bolt team explained that they want to give developers the ability to restrict the context to code parts you choose manually.
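Back-of-the-envelope, with assumed prices of $3 per million input tokens and $12 per million output tokens (illustrative numbers matching the 3-4x gap above, not actual Bolt or provider pricing):

```ts
// Cost of a "change one constant" request under the ASSUMED pricing above.
const INPUT_PRICE_PER_M = 3;   // $ per million input tokens (assumption)
const OUTPUT_PRICE_PER_M = 12; // $ per million output tokens (assumption)

const inputTokens = 200_000; // the whole project resent as context
const outputTokens = 1_000;  // a tiny edit in response

const cost =
  (inputTokens / 1_000_000) * INPUT_PRICE_PER_M +
  (outputTokens / 1_000_000) * OUTPUT_PRICE_PER_M;

console.log(cost.toFixed(3)); // ~0.612 -- dominated by the resent context
```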
But manually restricting the context would mean that as a Bolt user you have to make a lot of decisions and actually understand the code Bolt created. It would no longer be a tool that product managers and other non-coders can work with. Even an experienced developer might have a hard time understanding the huge amounts of code Bolt generates.
Alternatively, Bolt could find a good strategy for automatically choosing which parts of your code to fit into the context window. Unless we get much bigger context windows at cheaper prices, I suspect this cannot scale beyond a point, and we will have to take control of the code ourselves. Larger context windows also seem to degrade the quality of LLM answers, so there may be a natural limit here. Maybe the next generation of LLMs will not have these limitations.
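One way such an automatic strategy could look (a sketch only; the keyword-overlap scoring and the 150k budget are my assumptions, and a real system would more likely use embeddings or a dependency graph):

```ts
// Greedy context selection: score each file's relevance to the prompt,
// then pack the highest-scoring files until the token budget is spent.
interface ProjectFile {
  path: string;
  source: string;
}

function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4); // same crude heuristic as above
}

function relevance(file: ProjectFile, prompt: string): number {
  // Naive keyword overlap, purely for illustration.
  const words = prompt.toLowerCase().split(/\W+/).filter(Boolean);
  const source = file.source.toLowerCase();
  return words.filter((w) => source.includes(w)).length;
}

function selectContext(
  files: ProjectFile[],
  prompt: string,
  budget = 150_000 // leave headroom below the 200k window
): ProjectFile[] {
  const ranked = [...files].sort(
    (a, b) => relevance(b, prompt) - relevance(a, prompt)
  );
  const chosen: ProjectFile[] = [];
  let used = estimateTokens(prompt);
  for (const f of ranked) {
    const cost = estimateTokens(f.source);
    if (used + cost > budget) continue; // skip files that don't fit
    chosen.push(f);
    used += cost;
  }
  return chosen;
}
```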
Anyway, for a huge number of use cases, Bolt's approach already works just fine today: small homepages, small tools, scripts, npm packages, ...
Let's see where things go. More and more products like Bolt will appear, each with different strengths.
@samora254 thanks for your report, and we appreciate your patience as we are a small team working to support all of the new users! The context on this error, workaround ideas, and future updates on the R&D we are doing on this issue are being tracked in #1322 going forward, so please subscribe there!
@ayalpani is dead on with everything in that message! We are working to manage the context window better automatically, potentially by implementing a multi-agent approach: multiple AIs, each with a 200k context window, working together to help you manage different aspects of the project. This is all R&D at this point, but very exciting.
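To make that concrete, here is a purely speculative sketch of the partitioning idea; since this is still R&D, none of the names or shapes below come from Bolt:

```ts
// Speculative sketch: split a project across agents so that every
// individual agent's slice fits under its own 200k-token window.
const WINDOW = 200_000;

interface Agent {
  name: string;
  files: string[];
  usedTokens: number;
}

function assignToAgents(files: { path: string; tokens: number }[]): Agent[] {
  const agents: Agent[] = [];
  for (const file of files) {
    // First-fit bin packing: reuse an agent with room, else add a new one.
    let agent = agents.find((a) => a.usedTokens + file.tokens <= WINDOW);
    if (!agent) {
      agent = { name: `agent-${agents.length + 1}`, files: [], usedTokens: 0 };
      agents.push(agent);
    }
    agent.files.push(file.path);
    agent.usedTokens += file.tokens;
  }
  return agents; // each agent's slice fits in its own context window
}
```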
Describe the bug
I get an error when issuing a prompt. The error says, "prompt is too long: 202135 tokens > 200000 maximum".
Link to the Bolt URL that caused the error
https://bolt.new/~/sb1-969yan
Steps to reproduce
I am giving normal commands for changes to my app, but the commands are not accepted. I've checked that I still have enough tokens, and my prompts are short paragraphs.
Expected behavior
The requested changes take effect.
Screen Recording / Screenshot
No response
Platform
Additional context
No response