Open: krisquigley opened this issue 1 year ago
Hey @krisquigley, thanks for raising the issue. We're looking into this. Can you tell us more about how you got this response so we can reproduce it on our end?
Sure thing, @FayazRahman. Thanks for the quick response!
Set up an agent to generate a document based on a template:
agent.name = "RubyContractorGPT"
agent.description = (
"an AI assistant that produces contracts for Ruby Developers from a template"
)
agent.goals = [
"Write a contract for a Ruby Developer",
]
At some point the agent will try to read from a file on its own, or you can give it feedback telling it to read from a file immediately:
read_from_file, Args: {'file': './training-data/short-form-single-company.rtf'}
Ensure the file is over 4097 tokens long and you should see an error along the lines of:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 59431 tokens. Please reduce the length of the messages.
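For reference, a rough way to check whether a file is likely to blow the context window before sending it (this is just a sketch using the common ~4 characters per token heuristic; OpenAI's tiktoken library would give exact counts):

```python
# Rough token estimate for a piece of text, using the heuristic that one
# token is roughly 4 characters of English text. For exact counts you'd
# use OpenAI's tiktoken library; this sketch stays dependency-free.
MAX_CONTEXT_TOKENS = 4097  # the limit reported in the error above

def estimate_tokens(text: str) -> int:
    """Very rough token estimate: ~4 characters per token."""
    return len(text) // 4

def fits_in_context(text: str, limit: int = MAX_CONTEXT_TOKENS) -> bool:
    """True if the text is (probably) within the model's context window."""
    return estimate_tokens(text) <= limit

sample = "a" * 20_000  # ~5,000 estimated tokens
print(fits_in_context(sample))  # False: well over the 4097-token limit
```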
Thanks for the update @krisquigley! We will implement chunking soon.
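For anyone curious, a minimal sketch of what such chunking might look like (the function name here is hypothetical, not AutoGPT's actual API): split the file's text into pieces that each fit a token budget, so each piece can be sent to the model separately and the results combined afterwards.

```python
# Hypothetical chunking sketch: split text into pieces that each fit an
# estimated token budget. Uses the rough ~4 chars/token heuristic; a real
# implementation would count tokens with tiktoken and likely split on
# sentence or paragraph boundaries instead of fixed character offsets.

def chunk_text(text: str, max_tokens: int = 3000, chars_per_token: int = 4) -> list[str]:
    """Split text into chunks of at most max_tokens (estimated) each."""
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

chunks = chunk_text("x" * 30_000, max_tokens=3000)
print(len(chunks))  # 3 chunks of up to 12,000 characters each
```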
Fantastic work so far, guys!
To recap, I'm trying to pass the agent a template for document generation, but I keep hitting the context-length error above.
Do you plan on supporting longer files? If so, I can have a go at creating a PR if you could point me in the right direction. (I mostly use JS and Ruby, so my Python is a bit rusty.)