TaxyAI / browser-extension

Automate your browser with GPT-4
MIT License

How to handle big html? #16

Open bswhb opened 1 year ago

bswhb commented 1 year ago

I tried this URL - https://www.igniteui.com/grid/basic-editing - and got the following error:

This model's maximum context length is 4097 tokens. However, your messages resulted in 11438 tokens. Please reduce the length of the messages.

Is there any way to split the big HTML into several pieces and consolidate the output from OpenAI?

Christopher-Hayes commented 1 year ago

Chunking is an option, though given the cost of burning thousands of tokens every time you run Taxy, the first thing might be to figure out how to reduce the token count. GPT-4 supports 8K tokens, but is A LOT more expensive.
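For reference, a minimal sketch of what chunking could look like. This is not Taxy's actual code; the function name, the ~4-characters-per-token heuristic, and the 3,000-token budget are all assumptions, and a real implementation would use a proper tokenizer (e.g. tiktoken) to count tokens exactly:

```python
def chunk_html(html: str, max_tokens: int = 3000, chars_per_token: int = 4) -> list[str]:
    """Split an HTML string into pieces that fit under a token budget.

    Rough heuristic: ~4 characters per token for English text and markup.
    Prefers to cut at a tag boundary ('>') so chunks stay close to parseable.
    """
    max_chars = max_tokens * chars_per_token
    chunks = []
    start = 0
    while start < len(html):
        end = min(start + max_chars, len(html))
        if end < len(html):
            # Try to break just after the last complete tag in this window.
            cut = html.rfind(">", start, end)
            if cut > start:
                end = cut + 1
        chunks.append(html[start:end])
        start = end
    return chunks
```

Each chunk would then be sent to the model separately and the outputs consolidated, though for tasks that need the whole page structure (like finding an element to click), naive chunking can lose cross-chunk context, which is another reason pruning the DOM down first tends to work better.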