sashabaranov / go-openai

OpenAI ChatGPT, GPT-3, GPT-4, DALL·E, Whisper API wrapper for Go
Apache License 2.0
9k stars 1.38k forks

Discussion on Rethinking Design and Responsibility Scope #468

Open vvatanabe opened 1 year ago

vvatanabe commented 1 year ago

Overview

Currently, Go OpenAI supports both OpenAI API and Azure OpenAI Service. Many contributors have been actively supporting both, but over time, differences in specifications between the OpenAI API and Azure OpenAI have been emerging. For instance, discrepancies can be observed in the following:

The code abstracting both OpenAI API and Azure OpenAI Service is becoming increasingly complex, compromising its maintainability as a result. Moreover, it becomes challenging for users to determine which specific fields in the requests or responses are valid for the OpenAI API versus the Azure OpenAI Service.

Given this, there's a perceived need to discuss the future design and responsibility scope of Go OpenAI.

Proposal

I propose the following two approaches:

  1. Separate the code for OpenAI API and Azure OpenAI Service, placing each in different repositories.
  2. Keep the code for both OpenAI API and Azure OpenAI Service in the same repository but separate them into distinct packages.

Each of these approaches has its own merits and demerits. I have summarized each below.

1. Separate the code for OpenAI API and Azure OpenAI Service, placing each in different repositories

Pros:

Cons:

Note: As proposed by Alex, there's also an option to maintain separate libraries for OpenAI API and Azure OpenAI Service within the https://github.com/go-llm organization.

2. Keep the code for both OpenAI API and Azure OpenAI Service in the same repository but separate them into distinct packages

Pros:

Cons:
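As a concrete, purely hypothetical illustration of option 2, the repository could keep one Go module with two provider packages and a shared core:

```
go-openai/
├── openai/        # OpenAI API client (api.openai.com requests/responses)
├── azure/         # Azure OpenAI Service client (deployments, api-version, api-key auth)
└── internal/
    └── common/    # shared HTTP plumbing and types used by both packages
```

Users would import only the package for the provider they use, while shared code stays in one place.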

I would love to discuss this with all of you in this open forum.

zjy282 commented 1 year ago

I suggest option 2. Although the OpenAI API and Azure OpenAI Service have gradually diverged in places, I think the parts the two share still overwhelmingly outweigh their differences.

vvatanabe commented 1 year ago

FYI: I found azure-sdk-for-go, Azure's official Go SDK, which supports Azure OpenAI Service. https://github.com/Azure/azure-sdk-for-go

I'm asking out of curiosity, but what could be the reason for using Go OpenAI instead of the official SDK for Azure OpenAI Service?

zjy282 commented 1 year ago

> FYI: I found azure-sdk-for-go, Azure's official Go SDK, which supports Azure OpenAI Service. https://github.com/Azure/azure-sdk-for-go
>
> I'm asking out of curiosity, but what could be the reason for using Go OpenAI instead of the official SDK for Azure OpenAI Service?

I use Go OpenAI for the following reasons:

  1. I started using it before an Azure SDK existed.
  2. The project supports both Azure's and OpenAI's request formats.

ZeroDeng01 commented 1 year ago

I use go-openai mainly for the following 3 points:

About Discussion on Rethinking Design and Responsibility Scope

At present, go-openai supports both Azure OpenAI and the OpenAI API through a unified entry point, which is very convenient: callers usually need few code changes to stay compatible with both providers. However, as Azure and OpenAI diverge further, maintaining a single abstraction will become very troublesome. I therefore suggest keeping both providers in one repository but splitting them into two packages, while keeping the entry point unified. Adding a tag parameter (or a function pointer) would let the entry point decide whether the azure or the openai package should parse the response. The request side will not differ much between the two providers; the main divergence is in the returned content.

ealvar3z commented 1 year ago

I recommend a clean split, consolidating everything under the go-llm organization.

coggsflod commented 10 months ago

@ealvar3z the go-llm project is super interesting; however, I'm sure many folks would be wary of running it in an enterprise production environment given the number of experimental features in the code (i.e., arbitrary Bash code execution on the host machine).