Closed: ziova closed this issue 1 year ago
I will look into that over the holidays. The slow processing time is very painful.
EDIT: It turns out a tweak to chatgpt-wrapper can support this.
Do you have a branch with this tweak added that we could try?
No, I haven't gotten around to implementing this unfortunately. I'd be happy to accept any pull requests.
chatgpt-wrapper already supports this feature, and it's a matter of changing the interface between Python epc server and ChatGPT.el. See https://github.com/mmabrouk/chatgpt-wrapper/issues/37#issuecomment-1385306342
Also, I am not entirely certain, but comint-mode might be helpful here. https://github.com/joshcho/ChatGPT.el/issues/3
Thanks so much for this package!
I've changed `chatgpt.py`'s `bot.ask(query)` to `bot.ask_stream(query)`. This results in an error:

`TypeError("Object of type '<class 'generator'>' cannot be converted by `tosexp`. It's value is '<generator object ChatGPT.ask_stream at 0x10619a650>'")`
which makes sense. Any ideas on how a "generator" response could be processed?
Similar to the implementation in gpt.el
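One simple workaround, sketched below: since the EPC layer's `tosexp` can serialize a string but not a generator, drain the generator into a single string before returning it. This loses incremental display but avoids the `TypeError`. Note that `fake_ask_stream` here is a hypothetical stand-in for `bot.ask_stream(query)`, which is assumed to yield the reply as text chunks.

```python
def fake_ask_stream(query):
    # Hypothetical stand-in for ChatGPT.ask_stream(query):
    # assumed to yield the reply incrementally as string chunks.
    for chunk in ["Hello", ", ", "world", "!"]:
        yield chunk

def ask(query):
    # Consume the whole generator and join the chunks into one string,
    # which tosexp can serialize; the generator object itself cannot be.
    return "".join(fake_ask_stream(query))

print(ask("hi"))  # -> Hello, world!
```

True token-by-token streaming into the Emacs buffer would instead require pushing each chunk over EPC as it arrives (e.g. via repeated calls back into Emacs), rather than returning one value.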