Pythagora-io / gpt-pilot

The first real AI developer

[Bug]: AttributeError: 'NoneType' object has no attribute 'steps' #996

Open niklasfink opened 4 weeks ago

niklasfink commented 4 weeks ago

Version

Visual Studio Code extension

Operating System

macOS

What happened?

Pythagora abruptly stopped with the following error:

Stopping Pythagora due to error:

File `core/cli/main.py`, line 38, in run_project
    success = await orca.run()
File `core/agents/orchestrator.py`, line 64, in run
    response = await agent.run()
File `core/agents/developer.py`, line 87, in run
    return await self.breakdown_current_iteration()
File `core/agents/developer.py`, line 153, in breakdown_current_iteration
    self.set_next_steps(response, source)
File `core/agents/developer.py`, line 241, in set_next_steps
    for step in response.steps
AttributeError: 'NoneType' object has no attribute 'steps'

Using Pythagora v0.2.0 / GPT Pilot v0.2.1. There was no error from the LLM API.

Just before that, the Developer agent was returning the following (I escaped the code fences with \ so they display correctly):

Breaking down the current task iteration ...

Figuring out which project files are relevant for the next task ...

\```json
{
  "relevant_files": [
    "views/customers.ejs",
    "public/js/customers.js",
    "models/Customer.js",
    "routes/customerRoutes.js",
    "views/partials/_header.ejs",
    "views/partials/_footer.ejs",
    "views/partials/_head.ejs"
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": JSON.stringify({
          ...require('./package.json'),
          scripts: {
            ...require('./package.json').scripts,
            start: "nodemon server.js"
          }
        }, null, 2)
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": JSON.stringify({
          ...require('./package.json'),
          scripts: {
            ...require('./package.json').scripts,
            start: "nodemon server.js"
          }
        }, null, 2)
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```

\```json
{
  "steps": [
    {
      "type": "command",
      "command": {
        "command": "npm install dotenv",
        "timeout": 60,
        "success_message": "dotenv package installed successfully."
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "server.js",
        "content": "require('dotenv').config();\n" + require('fs').readFileSync('server.js', 'utf8')
      }
    },
    {
      "type": "save_file",
      "save_file": {
        "path": "package.json",
        "content": "{\n  \"name\": \"tt0\",\n  \"version\": \"1.0.0\",\n  \"main\": \"server.js\",\n  \"scripts\": {\n    \"start\": \"nodemon server.js\"\n  },\n  \"dependencies\": {\n    \"express\": \"^4.17.1\",\n    \"mongoose\": \"^5.10.9\",\n    \"bcrypt\": \"^5.0.0\",\n    \"express-session\": \"^1.17.1\",\n    \"ejs\": \"^3.1.5\",\n    \"bootstrap\": \"^5.3.2\",\n    \"nodemailer\": \"^6.4.11\",\n    \"pdfkit\": \"^0.11.0\",\n    \"dotenv\": \"^8.2.0\"\n  }\n}"
      }
    },
    {
      "type": "command",
      "command": {
        "command": "npm start",
        "timeout": 60,
        "success_message": "Application started successfully using npm start."
      }
    }
  ]
}
\```
senko commented 4 weeks ago

Thanks for reporting this @niklasfink.

Which LLM is that?

It looks like the LLM was giving an incorrect (invalid JSON) response. Pythagora retried twice but failed to parse the JSON all three times (if you look at the output you pasted, none of the three blocks is valid JSON, which leads me to believe it's some local model that has trouble producing correct JSON), and just returned None.
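
For reference, the first two pasted blocks concatenate JavaScript expressions (`+ require('fs').readFileSync(...)`) into what should be plain JSON string values, which is exactly the kind of thing a strict parser rejects. A minimal repro of that parse failure (illustrative only, not Pythagora's actual parsing code):

```python
import json

# The "content" value below embeds a JavaScript expression where a JSON
# string is expected, mirroring the first two blocks pasted above.
invalid = """{
  "path": "server.js",
  "content": "require('dotenv').config();\\n" + require('fs').readFileSync('server.js', 'utf8')
}"""

try:
    json.loads(invalid)
except json.JSONDecodeError as exc:
    # The stray '+' after the string value triggers the parse error.
    print("parse failed:", exc)
```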

So I don't think the cause of the error is in Pythagora, but I'd still like to keep this bug open as we should handle that edge case in a more graceful way.
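
As a rough illustration of that kind of graceful handling (the method name and arguments come from the traceback above; the guard and logging are hypothetical, not the actual `core/agents/developer.py` code):

```python
import logging

logger = logging.getLogger(__name__)

def set_next_steps(self, response, source):
    # Sketch only: bail out with a clear message when the parsed LLM
    # response is missing, instead of iterating over None.
    if response is None or not getattr(response, "steps", None):
        # Hypothetical handling: record the parse failure and skip this
        # iteration so the session is not lost.
        logger.error("LLM returned no parseable steps after retries; skipping this iteration.")
        return
    for step in response.steps:
        ...  # existing step-scheduling logic
```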

niklasfink commented 4 weeks ago

Hi @senko, it’s GPT-4o on Azure OpenAI. From my perspective it’s important to handle these cases, as otherwise Pythagora needs to be restarted and all progress since the last task is lost.

nikzart3 commented 3 weeks ago

> Hi @senko, it’s GPT-4o on Azure OpenAI. From my perspective it’s important to handle these cases, as otherwise Pythagora needs to be restarted and all progress since the last task is lost.

Hey, can you tell me how you got it to work with azure open ai api?

maxfinnsjo commented 3 weeks ago

[Pythagora] Stopping Pythagora due to error:

File `core/cli/main.py`, line 38, in run_project
    success = await orca.run()
File `core/agents/orchestrator.py`, line 66, in run
    response = await agent.run()
File `core/agents/external_docs.py`, line 48, in run
    selected_docsets = await self._select_docsets(available_docsets)
File `core/agents/external_docs.py`, line 95, in _select_docsets
    return {k: available_docsets[k] for k in llm_response.docsets}
File `core/agents/external_docs.py`, line 95, in <dictcomp>
    return {k: available_docsets[k] for k in llm_response.docsets}
KeyError: 'express'

Using the OpenAI API with gpt3-turbo.
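
This looks like a related but distinct edge case: the LLM listed a docset key ('express') that isn't present in `available_docsets`, so the dict comprehension in `_select_docsets` raises `KeyError`. A defensive variant could simply filter out unknown keys; a sketch under that assumption (hypothetical helper, not the project's actual `core/agents/external_docs.py` code):

```python
# Sketch only: the comprehension from the traceback, with unknown keys
# filtered out so a hallucinated docset name (e.g. 'express') no longer
# raises KeyError.
def select_known_docsets(available_docsets: dict, requested: list) -> dict:
    return {
        key: available_docsets[key]
        for key in requested
        if key in available_docsets
    }
```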