Closed: cmosguy closed this issue 1 year ago
What did you set the context window to?
I set the context window to 8096 or something like that... I guess I have to set it to a smaller amount, like 2048
Or should I set it to 2048-60?
It depends on the model; check the model's context window. But yes, generally it should be set to context_window - max_new_tokens
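The rule of thumb above can be sketched as a quick calculation (the values below are illustrative assumptions, not the extension's actual defaults):

```python
# Prompt budget = context_window - max_new_tokens, so the model
# keeps enough room to generate its completion.
# Example values only; check your model's real context window.
context_window = 4096    # total tokens the model can attend to (assumed)
max_new_tokens = 60      # tokens reserved for the generated completion (assumed)

prompt_budget = context_window - max_new_tokens
print(prompt_budget)  # 4036
```

If the prompt budget is set to the full context window instead, the prompt plus the generated tokens can exceed the model's limit, which is a common cause of this kind of error.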
This issue is stale because it has been open for 30 days with no activity.
Closing for now, feel free to re-open if you're still facing an issue
Any updates?
Keep getting this error message when the llm-vscode extension is running.