nullbio closed this issue 1 year ago
@nullbio Thank you for the question! Currently, the library does not support streaming, and contributions are very welcome 😄
No worries. Did you have some ideas about how the implementation would look? I can have a go at it, but I'd rather seek some guidance from you first in terms of how it should be done. And if not for me, for whoever else sees this issue and wants to do it.
@nullbio So it seems OpenAI uses server-sent events, here's a great intro about them.
Here are a couple of ideas about implementation:
• A high-level API might look something like this:
stream := client.CreateCompletionStream(...)
for {
    dataPiece, err := stream.Recv()
    if err == io.EOF {
        break // stream ended
    }
    if err != nil {
        // handle error
        break
    }
    // process dataPiece
}
// inspired by gRPC client-side streaming: https://grpc.io/docs/languages/go/basics/#client-side-streaming-rpc
• Here's an example implementation of an SSE client in Go: https://grciuta.medium.com/server-sent-events-with-go-server-and-client-sides-6812dca45c7
• Ideally we'd avoid 3rd-party libraries here. For reference, here's an example of such a library: https://github.com/r3labs/sse
• Go's io.Reader pattern might also be useful here: https://medium.com/learning-the-go-programming-language/streaming-io-in-go-d93507931185
I just sent a PR with a possible implementation. Please note that I have not consumed this new feature as a client yet. This implementation simply passes all linting and test checks.
Wow, thank you for the PR, it's so cool! 🙌🏻
I've got a couple of comments below, and I'd also like to test it as a client before we merge.
Hey @ealvar3z and @sashabaranov I was just looking for this feature. Glad you're already on it.
I hope the API you come up with supports io.Reader, as that would make it easy to use with all the other Readers/Writers in the standard library without extra boilerplate.
Modifying the example usage to use io.Copy would look like this:
package main

import (
    "context"
    "fmt"
    "io"
    "os"

    gogpt "github.com/sashabaranov/go-gpt3"
)

func main() {
    c := gogpt.NewClient("your token")
    ctx := context.Background()

    req := gogpt.CompletionRequest{
        Model:     gogpt.GPT3Ada,
        Stream:    true,
        MaxTokens: 5,
        Prompt:    "Lorem ipsum",
    }
    resp, err := c.CreateCompletion(ctx, req)
    if err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
    // This assumes the streamed response exposes an io.Reader.
    if _, err := io.Copy(os.Stdout, resp); err != nil {
        fmt.Fprintln(os.Stderr, err)
        os.Exit(1)
    }
}
Thanks for the great work!
Just merged streaming support into the master branch https://github.com/sashabaranov/go-gpt3/pull/61
I can see that the CompletionRequest takes a Stream bool, but I don't see anything in the code that would indicate streaming support or functionality. I could very easily be missing something though, so wanted to check with you. Thanks.