OpenInterpreter / open-interpreter

A natural language interface for computers
http://openinterpreter.com/
GNU Affero General Public License v3.0

New Terminal Option: `--no_live_response` #1278

Steve235lab opened 1 month ago

Steve235lab commented 1 month ago

Describe the changes you have made:

Add a new terminal option that lets users configure whether to render responses while receiving chunks (the classic, default behavior) or to perform a one-time render after all chunks have been received (the new behavior).

Performing a one-time render after all chunks are received prevents duplicate lines from appearing in the terminal, and especially when using OI over SSH it reduces bandwidth usage and flickering.
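The difference between the two modes can be sketched as follows. This is not Open Interpreter's actual rendering code (which uses Rich); it is a minimal illustration of "write per chunk" versus "buffer, then write once":

```python
def render_live(chunks):
    """Classic behavior: write each chunk as it arrives (one redraw per chunk)."""
    for chunk in chunks:
        print(chunk, end="", flush=True)
    print()


def render_once(chunks):
    """Proposed behavior: buffer all chunks, then write a single time."""
    text = "".join(chunks)
    print(text)
    return text


# One write instead of one per chunk: less traffic over SSH, no partial redraws.
render_once(["Hello", ", ", "world", "!"])
```

With a real Rich `Live` display the "live" path re-renders the whole panel on each update, which is what produces the repeated lines in scrollback; the buffered path sidesteps that by emitting the final text exactly once.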

Reference any relevant issues (e.g. "Fixes #000"):

Temporarily fixes #1127

Pre-Submission Checklist (optional but appreciated):

OS Tests (optional but appreciated):

Steve235lab commented 1 month ago

This has annoyed me for a long time, ever since I first used OI, and this is a working, if imperfect, solution. Just pull it and give it a try; you'll see what I'm talking about.

tyfiero commented 1 month ago

This is so cool, it's been an issue for a while. Thanks @Steve235lab

KillianLucas commented 2 weeks ago

Hi @Steve235lab, this is fantastic. I am annoyed by the original behavior as well! But I want to float two other solutions.

I think streaming is an important UX component of many modern AI systems, and I think we can fix the issue in two other ways:

  1. `--plain` — a flag that just removes Rich. It would merely `print(chunk, end="")` the chunks as plain text, more like Ollama. It would also work if someone wanted to pipe OI's output into something else. This should fix all problems, unless there's something deeper about the rate of streaming that's bad for SSH!
  2. Always printing the last 5 messages at the end of a message stream. This would fix the weird repeating behavior, I believe, because you'd scroll up and see 5 solid messages printed at once. It wouldn't fix flickering during streaming, and it wouldn't help with SSH bandwidth, but it would fix the repeating bug.
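The second option above could look something like this sketch. The message shape (`role`/`content` dicts) is an assumption for illustration, not Open Interpreter's actual internal format:

```python
def reprint_tail(messages, n=5):
    """Format the last n messages as one contiguous block.

    Printed once after the stream ends, this leaves clean output in
    scrollback in place of Rich's repeated partial re-renders.
    """
    tail = messages[-n:]
    return "\n\n".join(f"[{m['role']}] {m['content']}" for m in tail)


# Hypothetical conversation history for demonstration.
history = [{"role": "assistant", "content": f"reply {i}"} for i in range(8)]
print(reprint_tail(history))
```

As noted, this cleans up the scrollback but does nothing for flicker or bandwidth during the stream itself.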

What do you think?

Steve235lab commented 2 weeks ago

> `--plain` — a flag that just removes Rich. It would merely `print(chunk, end="")` the chunks as plain text, more like Ollama. It would also work if someone wanted to pipe OI's output into something else. This should fix all problems, unless there's something deeper about the rate of streaming that's bad for SSH!

This one is great, I will implement this later.
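A possible shape for that `--plain` mode, assuming nothing about Open Interpreter's internals beyond the `print(chunk, end="")` idea Killian describes (function and parameter names here are illustrative):

```python
import sys


def stream_plain(chunks, out=sys.stdout):
    """Stream chunks as raw text: no Rich, no re-rendering, pipe-friendly."""
    for chunk in chunks:
        out.write(chunk)
        out.flush()  # preserve the streaming feel without any redraws
    out.write("\n")


stream_plain(["plain ", "streaming ", "output"])
```

Since plain output is also what you want when stdout is a pipe, `sys.stdout.isatty()` could even be used to fall back to this mode automatically when OI's output is being redirected.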