simonw / llm-gpt4all

Plugin for LLM adding support for the GPT4All collection of models
Apache License 2.0

"The prompt size exceeds the context window size and cannot be processed" doesn't end the process cleanly #36

Open tyson-nw opened 5 months ago

tyson-nw commented 5 months ago

When I get the error ERROR: The prompt size exceeds the context window size and cannot be processed. followed by Exception in thread Thread-2 (run_llmodel_prompt):, the process stalls and I have to manually close the cmd window. When this error occurs it should end the process cleanly, writing the error to STDERR.
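For reference, here is a minimal sketch of the behaviour I'd expect when driving the model from Python via llm, assuming the context-overflow error were surfaced as a normal catchable exception rather than dying inside a worker thread (the model ID and prompt are just illustrative):

```python
import sys
import llm

# Illustrative model ID -- substitute whichever gpt4all model you have installed.
model = llm.get_model("mistral-7b-instruct-v0")

try:
    response = model.prompt("a prompt that is far too long ...")
    print(response.text())
except Exception as exc:
    # Desired behaviour: report the error on STDERR and exit with a
    # non-zero status instead of hanging the terminal.
    print(f"ERROR: {exc}", file=sys.stderr)
    sys.exit(1)
```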

As a side note, is there a way to determine the maximum prompt size for a model?
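My (possibly wrong) understanding is that the usable prompt size is capped by the n_ctx value the model is loaded with; the underlying gpt4all Python bindings expose it on the constructor and appear to default to 2048 tokens. A hedged sketch, using the bindings directly (the filename is illustrative):

```python
from gpt4all import GPT4All

# Assumption: n_ctx sets the context window in tokens at load time,
# so the prompt plus the generated output has to fit within it.
model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf", n_ctx=4096)
print(model.generate("Hello", max_tokens=64))
```

It would be nice if the plugin reported the effective context size per model (or exposed it as an option) so prompts could be checked before hitting this error.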