sam-lippert opened 1 year ago
@sam-lippert we should probably offer 2 or 3 alternatives ... if the request is an async process, you should be able to specify:
a) a webhook or callback URL (which we should facilitate via a DO and/or a webhook queue)
b) a nextQueue, which could be a Cloudflare Workers Queue or a Kafka queue via the Kafka.do abstraction
c) a poll-based approach where the client can call to get the status ... for persisting, we could just write it to a DO (since we should be making all of the requests to OpenAI via a DO anyway to ensure the request isn't lost). That also lets us continue a given GPT conversation by re-requesting the completion of an input, or by adding another user or function message for the next completion.
@nathanclevenger When the GPT Consumer class finishes processing the message, where should the result be stored? If the initial HTTP request is async and returns a 201 or 202, where should I fetch the data from once it comes back from OpenAI?