sigoden / aichat

All-in-one AI CLI tool featuring Chat-REPL, Shell Assistant, RAG, AI tools & agents, with access to OpenAI, Claude, Gemini, Ollama, Groq, and more.
Apache License 2.0

Edit last LLM message and continue #601

Closed blob42 closed 3 months ago

blob42 commented 3 months ago

Something that is missing in aichat is the ability to:

- edit the last LLM (assistant) message
- continue the generation from the last assistant message

I believe these are essential features for unlocking the capabilities of LLMs; all popular LLM web UIs have them.

I am willing to make a PR if no one is working on those already.

sigoden commented 3 months ago

@blob42

Could you send a video or GIF explaining these features?

I have only seen continue generation in official web UIs, not in self-deployed ones. Can you list projects that support this feature?

blob42 commented 3 months ago

I can cite at least two popular web UIs:

I will add a GIF later.

Note: in terms of implementation it would be trivial.

Continue just means resending the list of messages (user and assistant turns) with the last message coming from the assistant; the LLM will try to complete it if it can predict more.

Edit is a special case of continue where we edit the assistant message.
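For illustration, here is a minimal sketch of what "continue" amounts to against an OpenAI-compatible chat completions endpoint (the URL and model name are placeholders, and backends differ in how they treat a trailing assistant message): the request history simply ends with an assistant turn, and the model is asked to keep generating.

```python
# Minimal sketch (not aichat code): "continue" by resending the history
# with the last message coming from the assistant. The endpoint URL and
# model name are placeholders for any OpenAI-compatible backend.
import requests

messages = [
    {"role": "user", "content": "Write a haiku about the sea."},
    # Trailing assistant message: the model is asked to keep going
    # from here rather than answer a new user turn.
    {"role": "assistant", "content": "Waves fold on the shore,"},
]

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={"model": "my-model", "messages": messages},
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```

Whether the model actually appends to the prefix or just emits an end-of-output token depends on the backend.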

sigoden commented 3 months ago

We will support the continue generation feature and the regenerate feature. We will not support editing the last answer from the LLM, because that feature is not commonly used.

blob42 commented 3 months ago

> We will not support editing the last answer from the LLM, because that feature is not commonly used.

It's actually the most used feature in advanced LLM workflows, as it allows you to program the LLM to answer in a specific style.

Would you accept a PR for this?

sigoden commented 3 months ago

No. I have only seen styles defined by prompt, but not by modifying the reply message.

Even the official ChatGPT website doesn't have this button.

sigoden commented 3 months ago

While the TUI is not ideal for editing individual messages, it excels at editing entire session YAML files.

We can implement an .edit command to:

  1. Save the current session to a session YAML file.
  2. Open the session YAML file in the user's preferred text editor ($EDITOR).
  3. Reload the session from the saved YAML file.

I think this is the right approach for a TUI, rather than imitating a web UI.
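As a rough illustration only (not aichat's actual implementation), the flow could look like the following sketch: dump the session to a YAML file, hand it to $EDITOR, and load back whatever the user saved.

```python
# Rough sketch of the proposed .edit flow (illustrative, not aichat code):
# save the session to YAML, open it in $EDITOR, then reload it.
import os
import subprocess
import tempfile

import yaml  # PyYAML


def edit_session(session: dict) -> dict:
    editor = os.environ.get("EDITOR", "vi")
    with tempfile.NamedTemporaryFile("w", suffix=".yaml", delete=False) as f:
        yaml.safe_dump(session, f, sort_keys=False)
        path = f.name
    subprocess.run([editor, path], check=True)  # user edits the whole session
    with open(path) as f:
        return yaml.safe_load(f)  # reload the (possibly edited) session


session = {
    "messages": [
        {"role": "user", "content": "hello"},
        {"role": "assistant", "content": "Hi! How can I help?"},
    ]
}
session = edit_session(session)
```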

blob42 commented 3 months ago

Yes, this .edit command would be perfect. Editing the whole session is even better than what GUIs do :+1:

Regarding OpenAI, this feature is available in the Playground, where you can edit the LLM's answers. Again, this is the most advanced way to interact with an LLM. That is how, for example, the CLI prompt of llama.cpp works.

blob42 commented 3 months ago

Thanks @sigoden :)

The .edit command works perfectly. However, .continue is failing with `No incomplete response.` It is supposed to continue by sending back all the messages; the LLM might reply by appending to its previous answer, or just send the end-of-output token.

The full usage scenario would be:

sigoden commented 3 months ago

@blob42 The situation of aichat is complicated. It is necessary to add some restrictions. The current implementation does not affect the use of this feature.

blob42 commented 3 months ago

> @blob42 The situation of aichat is complicated. It is necessary to add some restrictions. The current implementation does not affect the use of this feature.

Based on what I understand from the code, the ask function is dependent on an input. The .continue feature as I describe it means no input, and this requires rethinking how the call to the API is constructed downstream. Is this the complication you are referring to?

sigoden commented 3 months ago

@blob42 Except for one case (loading a new session and then running .continue), it has no effect on anything else. As long as you have a successful input/output, you can use .continue. Is this limitation unbearable for you? Don't try to convince me to remove this restriction. You can use it yourself; it's really not a problem.

sigoden commented 3 months ago

Maybe the error message `No incomplete response.` misleads you. AIchat doesn't actually check whether the response is complete.

blob42 commented 3 months ago

> Except for one case (loading a new session and then running .continue).

Got it, I did not understand what restriction you meant before. Now I understand, it's fine for me.

blob42 commented 3 months ago

> Maybe the error message `No incomplete response.` misleads you.

You are setting last_message to None. Why not set last_message to the last message of the session? That would fix the .continue feature.

sigoden commented 3 months ago

The input/message is not just text but can also include images, videos, etc., which cannot simply be exchanged. I find it troublesome and not worth it.

sigoden commented 3 months ago

All the features are implemented, so I am closing the issue.

@blob42 Thanks for your contribution.