Erikcruzk / TRT

The Transformative Repair Tool

Fix Error when exceeding model's maximum context length #12

Closed: Erikcruzk closed this issue 1 year ago

Erikcruzk commented 1 year ago

CRITICAL:root:An exception occurred: This model's maximum context length is 8001 tokens, however you requested 11911 tokens (10729 in your prompt; 1182 for the completion). Please reduce your prompt; or completion length.
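The error comes from the OpenAI API: the prompt tokens plus the requested completion tokens exceed the model's context window (10729 + 1182 > 8001). A minimal sketch of one way to guard against this, not the repository's actual fix, is shown below. It assumes the prompt can be counted and truncated with the tiktoken library; the function name `truncate_prompt`, the model name, and the hard-coded limits (taken from the log above) are illustrative.

```python
# Sketch: keep prompt tokens + completion tokens within the context limit
# by truncating the prompt before calling the API. Not the project's actual fix.
import tiktoken

MAX_CONTEXT_TOKENS = 8001   # context limit reported in the error message
COMPLETION_TOKENS = 1182    # completion budget reported in the error message

def truncate_prompt(prompt: str, model: str = "gpt-4") -> str:
    """Trim the prompt so prompt + completion fit in the model's context window."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        # Fall back to a common encoding if tiktoken does not know the model name.
        encoding = tiktoken.get_encoding("cl100k_base")

    budget = MAX_CONTEXT_TOKENS - COMPLETION_TOKENS
    tokens = encoding.encode(prompt)
    if len(tokens) <= budget:
        return prompt
    # Keep only the first `budget` tokens of the prompt.
    return encoding.decode(tokens[:budget])
```

Truncating from the front is only one strategy; depending on where the relevant code sits in the prompt, trimming from the middle or summarizing parts of it may preserve more useful context.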