This tweaks `copilot-chat-shell-cb-prompt` to support streaming directly, rather than buffering internally in the (now removed) `copilot-chat-shell-maker-answer` variable.

In particular, shell-maker's `execute-command` is called with a lambda that accepts two arguments: `(lambda (response partial) ...)`. When `partial` is `t`, the response appears to be treated as a streaming chunk and appended to the existing content, without starting a new prompt.
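To make the callback contract concrete, here is a minimal sketch of how a command handler can forward chunks to that two-argument lambda. The `copilot-chat--stream-request` function and the `'done` sentinel are illustrative assumptions, not part of either library; only the `(funcall callback response partial)` shape comes from shell-maker:

```elisp
;; Sketch only: `copilot-chat--stream-request' and the `done' sentinel
;; are hypothetical stand-ins for the backend's streaming interface.
(defun my-copilot-chat-shell-cb-prompt (prompt shell-maker-callback)
  "Send PROMPT and stream chunks through SHELL-MAKER-CALLBACK.
SHELL-MAKER-CALLBACK takes (RESPONSE PARTIAL); PARTIAL non-nil
appends RESPONSE in place instead of starting a new prompt."
  (copilot-chat--stream-request
   prompt
   (lambda (chunk)
     (if (eq chunk 'done)
         ;; Final call: PARTIAL nil finishes the response and
         ;; lets shell-maker emit a fresh prompt.
         (funcall shell-maker-callback "" nil)
       ;; Streaming call: PARTIAL t appends CHUNK to the
       ;; current response without starting a new prompt.
       (funcall shell-maker-callback chunk t)))))
```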
Reference: https://github.com/xenodium/chatgpt-shell/blob/f7b1f1e4b8a07c97deba92d9a23145d192ce715f/shell-maker.el#L632
Demo:
https://github.com/user-attachments/assets/6fbe5657-d1f5-4ce9-933a-0cbd38d61dfb
Thanks for publishing this library, very nice!