nomic-ai / gpt4all

GPT4All: Chat with Local LLMs on Any Device
https://nomic.ai/gpt4all
MIT License
68.66k stars 7.52k forks

Ability to undo/edit previous request, response #1328

Open kelvincht opened 1 year ago

kelvincht commented 1 year ago

Feature request

Have the ability to:

  1. edit the last request, to get a better-quality response
  2. edit any previous request/response to tune response quality
  3. select and delete previous requests/responses to free up unwanted context and improve response quality.

Motivation

I quite like GPT4All because it is easy to set up and "just works" without a complex Python setup or complex LLM configs.

However, one limitation of GPT4All compared to LM Studio and koboldcpp is the ability to edit previous request/response context.

The reason is that, to get a good-quality response, I sometimes need to remove or tweak a previous request/response in the middle of the context.

Also, if my last prompt didn't get a good response, I sometimes want to edit that prompt to get a quality output.

Your contribution

I can provide feedback

cosmic-snow commented 1 year ago

I think a similar request has been made here:


Maybe also earlier ones. I'll update this comment if I find more.


Your first request looks doable, because it only affects one output and one input. In any case, it will have to roll back to before the change and process everything again.

However, if you edit the conversation history in another way, then it wouldn't be "in sync" with the model anymore. So I'm not sure about 2 & 3. (Assuming 2 means going back more than one request/response.)

MaxTheMooshroom commented 7 months ago

Looking into this feature currently, as I consider it necessary functionality for my workflow. I'm considering Merkle trees (the same structure git uses), and it won't need larger contexts since it will recalculate anyhow. If some other behaviour is desired, I'm amenable to that as well. Please advise.

edit: for clarity, I'm specifically talking about adding the functionality myself, but I don't want to implement it in an undesired way, hence my request for advice.
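For reference, the hash-chain idea being floated here could look something like the following toy sketch (hypothetical names, not anything from the GPT4All codebase): each conversation turn is hashed together with its parent's hash, so editing any earlier turn changes every descendant's hash, which is the property git's object graph relies on.

```python
import hashlib


def h(data: bytes) -> str:
    """SHA-256 hex digest, the same primitive git-style object graphs use."""
    return hashlib.sha256(data).hexdigest()


class Turn:
    """One request/response entry in the conversation. Its hash covers
    both its own text and its parent's hash, so an upstream edit
    invalidates (changes the hash of) every turn after it."""

    def __init__(self, text: str, parent=None):
        self.text = text
        self.parent = parent
        parent_hash = parent.hash if parent else ""
        self.hash = h((parent_hash + text).encode())


# Editing an earlier turn yields a new chain with different hashes,
# even where the downstream text is identical:
a = Turn("user: hi")
b = Turn("assistant: hello", a)
a2 = Turn("user: hi there")   # edited prompt
b2 = Turn("assistant: hello", a2)
assert b.hash != b2.hash
```

This makes it cheap to detect *where* a history diverged, though as noted below it may be more machinery than the problem needs.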

cebtenzzre commented 7 months ago

> I'm considering merkle trees (same structure git uses), and won't need larger contexts since it will recalculate anyhow

That's probably overkill - llama-cpp-python (which ooba's TGWUI uses) just caches the prompt, looks for what changed, and then only decodes the new part - the whole previous conversation is submitted to llama-cpp-python every time it changes. It also implements caching of previous prompts, either in-memory or on-disk, but TGWUI doesn't use it and I haven't personally found it necessary.

I think we should work towards what llama-cpp-python does.

3Simplex commented 5 months ago

Click the edit button, then change the text. When editing a user prompt, the text can be resubmitted for a new reply. When editing an assistant reply, the reply is simply saved and entered. Include an option to delete the record, opposite from and/or within the edit menu, to prevent accidental deletion.

(screenshot: EditLastPrompt)

ThiloteE commented 4 months ago

I dare to claim: adding this feature would tremendously improve the quality of the data sent to the datalake!

ThiloteE commented 2 months ago

I really think this issue should be part of the roadmap, or at least labeled as a medium/high-priority feature.