huggingface / llm-vscode

LLM powered development for VSCode
Apache License 2.0
1.24k stars 133 forks

Getting Input validation error #94

Closed cmosguy closed 1 year ago

cmosguy commented 1 year ago

I keep getting this error message when the llm-vscode extension is running:

Input validation error: `inputs` tokens + `max_new_tokens` must be <= 2048. Given: 3474 `inputs` tokens and 60 `max_new_tokens`
McPatate commented 1 year ago

What did you set the context window to?

cmosguy commented 1 year ago

I set the context window to 8096 or something like that... I guess I have to set it to a smaller value. Should it be 2048, or should I set it to 2048 - 60?

McPatate commented 1 year ago

It depends on the model; check the model's context window. But yes, generally it should be set to `context_window - max_new_tokens`.
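The arithmetic behind this advice can be sketched as follows. This is an illustrative check, not the extension's or the server's actual code; the function and variable names are made up for the example:

```python
def fits_context(input_tokens: int, max_new_tokens: int, context_window: int = 2048) -> bool:
    """Return True if the prompt plus the generation budget fits in the model's context.

    This mirrors the validation in the error message above:
    `inputs` tokens + `max_new_tokens` must be <= context_window.
    """
    return input_tokens + max_new_tokens <= context_window


# The failing request from this issue: 3474 prompt tokens + 60 new tokens > 2048.
print(fits_context(3474, 60))  # False

# Setting the extension's context window to context_window - max_new_tokens
# leaves room for generation: the prompt gets trimmed to at most 1988 tokens.
prompt_budget = 2048 - 60
print(fits_context(prompt_budget, 60))  # True
```

So for a 2048-token model with `max_new_tokens: 60`, the extension's context window should be at most 1988.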

github-actions[bot] commented 1 year ago

This issue is stale because it has been open for 30 days with no activity.

McPatate commented 1 year ago

Closing for now; feel free to re-open if you're still facing an issue.

hossamrizk commented 5 months ago

Any updates?