xenodium / chatgpt-shell

A multi-LLM Emacs shell (ChatGPT, Claude, Gemini, Ollama) + editing integrations
https://lmno.lol/alvaro
GNU General Public License v3.0
857 stars 76 forks

[FR] Streaming for `ob-chatgpt-shell` #158

Closed NightMachinery closed 12 months ago

NightMachinery commented 12 months ago

Without streaming, it's just too slow.

xenodium commented 12 months ago

Thanks for filing. Unfortunately, the delay is in waiting for the server's response. Not much we can do there.

NightMachinery commented 12 months ago

@xenodium The server supports streaming output though? See here.

xenodium commented 12 months ago

Ah sure. The shell is already using the streaming API, but babel doesn't quite lend itself to streaming out of the box, AFAIK.
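For context on the streaming API side: OpenAI's chat completions endpoint, when called with `stream: true`, returns server-sent events whose `data:` payloads carry incremental content deltas. A minimal sketch of parsing such a stream (standalone Python, illustrative only, not chatgpt-shell's actual implementation):

```python
import json

def parse_sse_chunks(lines):
    """Yield content deltas from OpenAI-style SSE 'data:' lines.

    Each event looks like: data: {"choices":[{"delta":{"content":"Hi"}}]}
    The stream ends with the sentinel line: data: [DONE]
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel
        event = json.loads(payload)
        delta = event["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]

# Example: reassemble a streamed reply from raw SSE lines.
raw = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    'data: [DONE]',
]
print("".join(parse_sse_chunks(raw)))  # → Hello
```

The shell can display each delta as it arrives; the open question in this issue is how to do the same inside a babel results block.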

NightMachinery commented 12 months ago

@xenodium emacs-jupyter correctly streams all output in babel. In fact, I am currently using Jupyter Python for streaming ChatGPT outputs. Here is an example to verify the streaming support:

#+begin_src jupyter-python :kernel py_base :session emacs_py_1 :async yes :exports both
import time

print("1")
time.sleep(4)
print("2")
#+end_src

#+RESULTS:
: 1
: 2
xenodium commented 12 months ago

If I'm understanding correctly, emacs-jupyter built its own async support, since async streaming isn't built into babel itself; I'm not familiar with jupyter, though. This is a little out of scope for me. Patches welcome though.
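The general pattern async babel backends use — insert a placeholder result immediately, then rewrite it in place as output arrives — can be sketched like this (hypothetical Python model of an org buffer as a list of lines, not emacs-jupyter's actual API):

```python
def stream_into_results(doc, chunks):
    """Simulate streaming output into an org #+RESULTS: block.

    A placeholder result line is inserted up front, then rewritten
    in place each time a new chunk of output arrives.
    """
    doc.append("#+RESULTS:")
    result_index = len(doc)   # index of the result line we will rewrite
    doc.append(": ")          # placeholder, updated as chunks arrive
    text = ""
    for chunk in chunks:
        text += chunk
        doc[result_index] = ": " + text  # replace in place, don't append
    return doc

# Example: a source block whose result fills in incrementally.
doc = ["#+begin_src chatgpt-shell", "Hello", "#+end_src"]
stream_into_results(doc, ["stream", "ed ", "reply"])
print(doc[-1])  # → : streamed reply
```

In Emacs terms, the "rewrite in place" step would mean tracking the result region with markers and replacing its contents on each chunk, which is roughly the machinery emacs-jupyter had to build itself.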