stackblitz / bolt.new

Prompt, run, edit, and deploy full-stack web applications
https://bolt.new
MIT License
5.18k stars 1.03k forks

Provides Incorrect and lazy code #705

Open ruizTechServices opened 2 days ago

ruizTechServices commented 2 days ago

### Describe the bug

It messes up the code, and I have to check each file individually. This is eating into my monthly token count and my ability to make progress on my project. It produces code like this:

```tsx
// ... (previous imports remain the same)

export default function FineTuningInterface() {
  // ... (previous state declarations remain the same)

  const handleModelChange = (value: string) => {
    setSelectedModel(value);
  };

  // ... (rest of the component remains the same)

  return (
    <Card className="w-full max-w-md mx-auto">
      {/* ... (previous content remains the same) */}
      <CardContent>
        <div className="space-y-4">
          <div className="space-y-2">
            <Label htmlFor="base-model">Base Model</Label>
            <Select value={selectedModel} onValueChange={handleModelChange}>
              <SelectTrigger id="base-model">
                <SelectValue placeholder="Select base model" />
              </SelectTrigger>
              <SelectContent>
                {BASE_MODELS.map((model) => (
                  <SelectItem key={model} value={model}>
                    {model}
                  </SelectItem>
                ))}
              </SelectContent>
            </Select>
          </div>
        </div>
      </CardContent>
    </Card>
  );
}
```
This is annoying, and I can't make progress efficiently this way. I already paid for a year!!!

### Link to the Bolt URL that caused the error

https://bolt.new/~/sb1-9q1tdd

### Steps to reproduce

  1. Be knee-deep in a code project
  2. Ask for a substantial addition to the project, like "help me integrate Supabase into my project"
  3. As the code generates, the LLM eventually becomes lazy and starts outputting code like this:
```tsx
// ... (previous imports remain the same)

export default function FineTuningInterface() {
  // ... (previous state declarations remain the same)

  const handleModelChange = (value: string) => {
    setSelectedModel(value);
  };

  // ... (rest of the component remains the same)

  return (
    {/* ... (previous content remains the same) */}
    {BASE_MODELS.map((model) => (
      {model}
```

### Screen Recording / Screenshot

_No response_

### Platform

```
Browser name = Chrome
Full version = 129.0.0.0
Major version = 129
navigator.appName = Netscape
navigator.userAgent = Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36
performance.memory = {
  "totalJSHeapSize": 155841975,
  "usedJSHeapSize": 131285707,
  "jsHeapSizeLimit": 4294705152
}
Username = ruizTechServices
Chat ID = 1e1f83896089
```

### Additional context

_No response_
ruizTechServices commented 2 days ago

If I may suggest, maybe users could have their own system prompt. If not, then the main system prompt should be modified so that tokens aren't wasted and the LLM consistently produces complete code. If code is correctly generated, it should be left alone, not modified. Currently the agent has me repeating myself multiple times!!!
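
To illustrate the idea of rejecting incomplete output automatically (this is just a sketch, not anything bolt.new actually does — the function name and patterns are hypothetical), a client could scan generated code for the elision markers shown above and retry instead of writing the truncated file:

```typescript
// Hypothetical helper: flag "lazy" LLM output that elides code with
// placeholder comments, so the generation could be rejected or retried.
const ELISION_PATTERNS: RegExp[] = [
  /\/\/\s*\.\.\./,      // matches "// ... (previous imports remain the same)"
  /\{\/\*\s*\.\.\./,    // matches "{/* ... (previous content remains the same) */}"
  /remains? the same/i, // matches the phrasing itself, in any comment style
];

export function looksTruncated(code: string): boolean {
  return ELISION_PATTERNS.some((pattern) => pattern.test(code));
}

// The snippet from this issue would be flagged; complete code would not.
console.log(looksTruncated("// ... (previous imports remain the same)"));
console.log(looksTruncated("const x = 1;"));
```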