farizrahman4u / loopgpt

Modular Auto-GPT Framework
MIT License

Support chunking files #6

Open krisquigley opened 1 year ago

krisquigley commented 1 year ago

Fantastic work so far guys!

I'm currently trying to pass it a template for document generation but I am getting the following response:

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 59431 tokens. Please reduce the length of the messages.

Do you plan on supporting longer files? If so, I could have a go at creating a PR if you point me in the right direction. (I mostly use JS and Ruby; my Python is a bit rusty.)

FayazRahman commented 1 year ago

Hey @krisquigley, thanks for raising the issue. We're looking into this, can you tell us more about how you got this response so we can reproduce it on our end?

krisquigley commented 1 year ago

Sure thing, @FayazRahman. Thanks for the quick response!

  1. Set up an agent to generate a document based on a template:

    agent.name = "RubyContractorGPT"
    agent.description = (
        "an AI assistant that produces contracts for Ruby Developers from a template"
    )
    agent.goals = [
        "Write a contract for a Ruby Developer",
    ]
  2. At some point the agent will try to read from a file on its own; alternatively, give it feedback telling it to read one immediately: `read_from_file`, Args: `{'file': './training-data/short-form-single-company.rtf'}`

  3. Ensure the file is longer than 4097 tokens and you should see an error along the lines of:

    openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 59431 tokens. Please reduce the length of the messages.
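
As a rough sanity check before handing a file to the agent, you can estimate whether it will blow the context window. This is just a sketch using the common ~4-characters-per-token heuristic for English text (`estimate_tokens` and `MAX_CONTEXT` are illustrative names, not part of loopgpt); an exact count would require the model's actual tokenizer:

    def estimate_tokens(text: str, chars_per_token: int = 4) -> int:
        """Rough token estimate: ~4 characters per token for English text.

        For exact counts, use the model's tokenizer instead of this heuristic.
        """
        return max(1, len(text) // chars_per_token)

    MAX_CONTEXT = 4097  # the limit reported in the error above

    # A 59431-token message corresponds to roughly 240 KB of text.
    sample = "x" * 240_000
    if estimate_tokens(sample) > MAX_CONTEXT:
        print("too long; needs chunking")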

FayazRahman commented 1 year ago

Thanks for the update @krisquigley! We will implement chunking soon.