Closed eaalmousa closed 3 hours ago
Appreciate the feedback! This is a known issue with the AI model's context window size and is being tracked in #1322, which also lists workarounds and ongoing development on this problem. Hope this helps!
Describe the bug
I encountered an issue where Bolt showed the error "Prompt is too long: 201106 tokens > 200000 maximum." This occurred when I tried to preview my project after importing it successfully. The tool appears to enforce a token limit on prompts, and my project exceeds that limit.
Link to the Bolt URL that caused the error
https://bolt.new/~/sb1-fjf9uw
Steps to reproduce
1. Go to the Bolt project import interface.
2. Import the project from my source.
3. Follow the prompts to install dependencies using npm install.
4. Start the application using npm run dev.
5. Attempt to preview the project.
6. Observe the error message about the token limit.
Expected behavior
I expected Bolt to process the project preview without hitting a token limit or provide a way to manage or optimize the project to fit within the limit.
Additional context
The project involves a large codebase, and it seems the total token count exceeds the maximum Bolt supports for a single prompt. I need guidance on how to handle this, such as reducing the token count or splitting the project into smaller components.
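As a starting point for reducing the token count, here is a rough sketch that ranks a project's files by estimated token count so the largest offenders can be excluded or split first. It assumes a heuristic of roughly 4 characters per token; this is an approximation, not Bolt's actual tokenizer, so treat the numbers as relative rather than exact.

```python
import os

# Assumption: ~4 characters per token. This is a common rule of thumb,
# not the tokenizer Bolt actually uses, so results are only indicative.
CHARS_PER_TOKEN = 4

# Directories that usually should not be sent to the model anyway.
SKIP_DIRS = {"node_modules", ".git", "dist", "build"}

def estimate_tokens(path):
    """Estimate the token count of a single text file."""
    try:
        with open(path, encoding="utf-8", errors="ignore") as f:
            return len(f.read()) // CHARS_PER_TOKEN
    except OSError:
        return 0

def largest_files(root, top=10):
    """Return (estimated_tokens, path) pairs for the biggest files under root."""
    sizes = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune ignored directories in place so os.walk skips them.
        dirnames[:] = [d for d in dirnames if d not in SKIP_DIRS]
        for name in filenames:
            full = os.path.join(dirpath, name)
            sizes.append((estimate_tokens(full), full))
    return sorted(sizes, reverse=True)[:top]
```

Running largest_files("path/to/project") on the imported project would show which files contribute most to the 201106-token total, which can guide what to exclude or break apart before retrying the preview.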