Closed mikesoylu closed 1 year ago
Hi, sounds promising! I'll play around with your fork this weekend.
How does the streaming work? Does it actually update the buffer in real time?
Cool 🙂 Streaming uses the --no-buffer option of curl together with the API's stream: true option, and inserts every new token into the buffer as it arrives.
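For anyone who wants to poke at that mechanism outside the plugin, here's a rough sketch of what such a request looks like; the endpoint, model name, and prompt are illustrative placeholders, not necessarily exactly what the fork sends:

```sh
# --no-buffer (-N) stops curl from buffering its output, and "stream": true
# makes the API return incremental SSE chunks ("data: {...}" lines), so each
# new token can be handled as soon as it arrives instead of waiting for the
# full response.
curl --no-buffer https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
        "model": "gpt-3.5-turbo",
        "stream": true,
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```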
I had a go with it over the weekend, awesome work!! The streaming really makes it feel more responsive!
I like the idea of using the Chat API, since it's faster and cheaper, but the results are worse than the regular completions/edits APIs. It seems like Chat has issues where it ignores the prompt and reverts to being a virtual assistant. I think they intend to fix that in a future model though, so let's keep an eye on it... At the pace that things are going this could be an option in a month or two.
Sounds good, I'll close this for the time being then 🙂
I've added streaming support based on your implementation. Thanks for that!
I ended up using virtual text to display the in-progress stream rather than updating the buffer with every token, which was driving my linter plugins a bit wild 🙂
Hey there, love this plugin 🔥
I've spun off a hard fork here that is based on ChatGPT and removes some of the functionality you have here (it diverges quite a bit from this repo's main).
Changes introduced:
Let me know if you'd like me to open a PR to merge those changes in 🙂