TaxyAI / browser-extension

Automate your browser with GPT-4
MIT License
1.01k stars · 395 forks

Context length exceeded #21

Open jamesmurdza opened 1 year ago

jamesmurdza commented 1 year ago

I get this error when running any command on Amazon.co.uk:

"This model's maximum context length is 8192 tokens. However, your messages resulted in 8661 tokens. Please reduce the length of the messages."

I guess this happens because the extension tries to maximize the number of tokens it packs into GPT-4's maximum context size?
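One way to avoid the API error would be a pre-flight check before sending the prompt. This is a hypothetical helper, not Taxy's actual code: it uses the rough heuristic that GPT-style tokenizers average about 4 characters per English token (a real implementation would use the tiktoken library for exact counts), and truncates the page text to fit the 8192-token window minus a reserved reply budget.

```python
# Hypothetical pre-flight token check; chars/4 is a cheap estimate,
# not an exact tokenizer count.

MAX_CONTEXT_TOKENS = 8192
RESPONSE_BUDGET = 1024  # tokens reserved for the model's reply


def estimate_tokens(text: str) -> int:
    """Crude token estimate: ~4 characters per token."""
    return max(1, len(text) // 4)


def trim_to_budget(page_html: str, prompt_overhead: int = 500) -> str:
    """Truncate the page HTML so prompt + reply fits in the context window."""
    budget = MAX_CONTEXT_TOKENS - RESPONSE_BUDGET - prompt_overhead
    if estimate_tokens(page_html) <= budget:
        return page_html
    # Keep roughly budget * 4 characters of the page.
    return page_html[: budget * 4]
```

Truncation from the end is lossy, of course — the element the user needs might be cut off — so this only papers over the error rather than fixing it.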

paulVu commented 1 year ago

I think the problem is that it crawls all of the HTML, so it always goes over the token limit. I think we shouldn't rely on GPT alone — we could combine it with https://webscraper.io and base the actions on that.

CryptoMitch commented 1 year ago

You're right that Webscraper is a good option. What if we implemented a method to preprocess the HTML content before sending it to GPT-4, reducing the token count while preserving the relevant information?
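As a rough illustration of that preprocessing idea (a hypothetical sketch, not the extension's actual code), one could strip scripts, styles, and layout markup and keep only the interactive elements the model needs to choose an action — links, buttons, and form fields, with a handful of identifying attributes. This version uses only Python's standard-library HTML parser; the tag and attribute whitelists are assumptions:

```python
# Hypothetical HTML compression pass: keep only interactive elements
# and a few attributes useful for targeting them.
from html.parser import HTMLParser

INTERACTIVE_TAGS = {"a", "button", "input", "select", "option", "textarea"}
KEPT_ATTRS = {"id", "name", "href", "type", "value", "placeholder", "aria-label"}
VOID_TAGS = {"input"}  # no closing tag, so capture them at the start tag


class InteractiveExtractor(HTMLParser):
    """Collects a compact, one-line-per-element summary of a page."""

    def __init__(self):
        super().__init__()
        self.elements = []
        self._inside = 0  # nesting depth within interactive elements

    def handle_starttag(self, tag, attrs):
        if tag in INTERACTIVE_TAGS:
            kept = " ".join(f'{k}="{v}"' for k, v in attrs if k in KEPT_ATTRS and v)
            self.elements.append(f"<{tag}{' ' + kept if kept else ''}>")
            if tag not in VOID_TAGS:
                self._inside += 1

    def handle_data(self, data):
        # Attach visible text (e.g. link labels) to the current element.
        if self._inside and data.strip():
            self.elements[-1] += " " + data.strip()

    def handle_endtag(self, tag):
        if tag in INTERACTIVE_TAGS and tag not in VOID_TAGS and self._inside:
            self._inside -= 1


def compress_page(html: str) -> str:
    parser = InteractiveExtractor()
    parser.feed(html)
    return "\n".join(parser.elements)
```

On a page like Amazon's this should cut the payload dramatically, since the bulk of the markup is layout, scripts, and styling rather than actionable elements — though anything outside the whitelist (plain text the user asked about, for example) is lost.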

nithinreddyyyyyy commented 1 year ago

How can this issue be overcome, then?