Significant-Gravitas / AutoGPT-Code-Ability

🖥️ AutoGPT's Coding Ability - empowering everyone to build software using AI

Add DocumentationExtractor AIBlock #205

Closed: Torantulino closed this 4 months ago

Torantulino commented 4 months ago

Instead of sending the entire README in the Enhanced Error Message, this PR uses an AIBlock to extract the relevant information from the docs using an LLM.

This approach may be simple and a bit crude, but in playground testing it produces good results.
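
As a rough illustration of the idea only (this is not the repo's actual AIBlock interface; the helper name and prompt wording below are invented for this sketch, and the model name is just taken from the prompt-template path added in this PR), the extraction step amounts to asking an LLM to return only the part of the docs that helps fix the error:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def extract_relevant_docs(error_message: str, readme: str) -> str:
    # Ask the model for only the documentation relevant to the failure,
    # instead of forwarding the entire README in the enhanced error message.
    response = client.chat.completions.create(
        model="gpt-4-0125-preview",
        messages=[
            {
                "role": "system",
                "content": "Extract only the documentation relevant to fixing the user's error.",
            },
            {
                "role": "user",
                "content": f"Error:\n{error_message}\n\nDocumentation:\n{readme}",
            },
        ],
    )
    return response.choices[0].message.content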

A real example of the response that would be sent to the LLM after it fails to use a module correctly:

To fix the error "ChatCompletion" is not a known member of module "openai", you should use the following code instead:

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
)

The documentation shows that the correct way to create a chat completion is by calling client.chat.completions.create() with the appropriate parameters, not openai.ChatCompletion.create().
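
For context, the suggested snippet assumes a client constructed with the v1 openai Python SDK; a self-contained version of the corrected call looks like this:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
)
print(chat_completion.choices[0].message.content)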

linear[bot] commented 4 months ago

AGPT-668

Torantulino commented 4 months ago

The code changes look good overall, but I noticed a few things that could be improved:

  1. In ai_extractor.py, the create_item method is empty and just passes. It would be better to either implement the database storage logic or add a TODO comment indicating that it needs to be implemented in the future.

  2. In code_validation.py, there is an empty pass statement in the if not func.function_id: block. It would be helpful to add a comment explaining what should happen in that case or why it's being skipped for now.

  3. In database.py, the get_ids_from_function_id_and_compiled_route function has a lot of nested includes when fetching the function and compiled route. While this may be necessary to get all the required data, it could impact performance if the function is called frequently. Consider optimizing the queries if needed.

  4. The retry.j2 and user.j2 prompt templates in codex/prompts/gpt-4-0125-preview/validate/documentation_extractor/ have the same content. If this is intentional, it's fine, but if they are meant to be different, make sure to update one of them.

  5. The system.j2 prompt template is empty. If it's not needed, consider removing the file. If it's meant to have content, make sure to add it.

  6. It would be helpful to add some comments or docstrings in the get_ids_from_function_id_and_compiled_route function to explain what it does and what the expected inputs and outputs are (see the sketch after this list).
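
A minimal sketch of suggestion 6; the signature and return shape below are assumed for illustration, since database.py is not shown in this thread:

# Assumed signature for illustration only; the real function in database.py may differ.
async def get_ids_from_function_id_and_compiled_route(
    function_id: str, compiled_route_id: str
):
    """Fetch the identifiers needed to invoke the documentation-extractor block.

    Args:
        function_id: ID of the function whose validation failed.
        compiled_route_id: ID of the compiled route that contains the function.

    Returns:
        The related IDs (e.g. user, application, specification) required to
        call the LLM with the correct context.
    """
    ...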

Overall, the changes introduce a new DocumentationExtractor class and update the error enhancement logic to extract relevant documentation using the LLM. The get_ids_from_function_id_and_compiled_route function is added to fetch the necessary IDs for the LLM invocation. The code is well-structured and follows good practices, with just a few minor suggestions for improvement.

Lots of Love,

Claude Opus

Torantulino commented 4 months ago

I'm seriously impressed with Claude Opus's PR Reviews...!