Closed — abielzulio closed this issue 1 year ago
This extension is so, so good. I wanted to say this is a great feature, because I find that for longer responses I can wait anywhere between 10-30 seconds for a response. Streaming mode would allow us to save time reading the response as it's being generated. Thanks again for all that you do!
I've been working on this feature for a while, and it seems it's not something I can add in the near future. There's a noticeable performance hit on my end (due to the openai module, I guess), and it's taking a lot of time to debug. I'll submit the extension to the store first, and perhaps we can start from there.
I've added the feature: https://github.com/raycast/extensions/pull/5253/commits/589fe810845d8df33af12f7421b4d778c0bc9955
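For anyone curious how streaming is generally wired up on the client side, here's a minimal sketch of the accumulate-and-re-render pattern. It assumes the chunk shape used by the openai Node client (`choices[0].delta.content`); `fakeStream` stands in for a real `client.chat.completions.create({ ..., stream: true })` call, and `render` is a hypothetical UI callback, not the extension's actual API:

```typescript
// Shape of a streamed chunk as delivered by the openai Node client.
type Chunk = { choices: { delta: { content?: string } }[] };

// Stand-in for the real streaming API call; yields deltas one at a time.
async function* fakeStream(): AsyncGenerator<Chunk> {
  for (const piece of ["Hello", ", ", "world", "!"]) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate deltas and re-render after each chunk, so the user can
// start reading the answer while it's still being generated.
async function consume(
  stream: AsyncIterable<Chunk>,
  render: (text: string) => void
): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? "";
    render(text); // update the UI with the partial response
  }
  return text;
}

consume(fakeStream(), () => {}).then((t) => console.log(t)); // prints "Hello, world!"
```

The re-render on every chunk is where the performance cost mentioned above tends to show up, so real implementations often throttle or batch the `render` calls.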