ItsPi3141 / alpaca-electron

The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer
MIT License

Length of input field is limited. Make it configurable, please #26

Closed stefan-wiesner closed 1 year ago

stefan-wiesner commented 1 year ago

Is your feature request related to a problem? Please describe.
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

Describe the solution you'd like
A clear and concise description of what you want to happen.

Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.

Additional context
Add any other context or screenshots about the feature request here.

ItsPi3141 commented 1 year ago

The context size for llama.cpp is limited to 2048 tokens. It won't remember anything past that point, so allowing longer inputs is pointless: if you give it an input longer than 2048 tokens, it will simply forget the beginning of the prompt.
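The "forgets the beginning" behavior can be pictured as a sliding window over the prompt. Here is a minimal sketch (not the real llama.cpp implementation, which uses a BPE tokenizer rather than whitespace splitting) showing how only the most recent tokens fit into a fixed context:

```python
# Illustration only: a fixed context window keeps the last N tokens,
# so anything before the window is dropped. Real llama.cpp counts
# BPE tokens, not whitespace-separated words.
def fit_to_context(prompt, max_tokens=2048):
    tokens = prompt.split()          # naive "tokenizer" for the sketch
    return " ".join(tokens[-max_tokens:])  # keep only the newest tokens

# A 3000-"token" prompt: everything before token 952 is forgotten.
long_prompt = " ".join(f"t{i}" for i in range(3000))
trimmed = fit_to_context(long_prompt)
```

After trimming, the window holds exactly 2048 tokens and starts at `t952`, which is why the model never sees the opening of an over-long prompt.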

stefan-wiesner commented 1 year ago

Ok, I did not know that. In ChatGPT there is a way to tell it not to reply yet and just continue with more input; maybe that's a way to split it. But it's running on a CPU, so there must be limits.

ItsPi3141 commented 1 year ago

You could try getting it to summarize the text one paragraph at a time, then put all the summaries together and feed that combined text to the AI again.