sashabaranov / go-openai

OpenAI ChatGPT, GPT-3, GPT-4, DALL·E, Whisper API wrapper for Go
Apache License 2.0

Azure OpenAI assistants list interface parameter splicing error #794

Closed WhiteNightMo closed 2 months ago

WhiteNightMo commented 2 months ago

Environment:

Screenshots/Logs: (screenshot attached)

Additional context: If the request parameters include paging information, the generated URL contains two question marks. The output I printed:

c.fullURL(urlSuffix): https://xxx.openai.azure.com/openai/assistants?limit=10?api-version=2023-05-15
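One way to avoid the double question mark is to parse the suffix, merge its existing query (e.g. `limit=10`) with the `api-version` parameter, and encode the result once. This is a minimal sketch, not the library's actual implementation; the helper name `buildAzureURL` and the example host are hypothetical:

```go
package main

import (
	"fmt"
	"net/url"
)

// buildAzureURL is a hypothetical helper illustrating the fix: rather than
// blindly appending "?api-version=...", parse the suffix, merge any query it
// already carries with api-version, and encode the combined query once.
func buildAzureURL(baseURL, suffix, apiVersion string) string {
	parsed, err := url.Parse(suffix)
	if err != nil {
		// Fall back to naive concatenation if the suffix is unparseable.
		return baseURL + suffix
	}
	query := parsed.Query() // picks up e.g. limit=10 from "/assistants?limit=10"
	query.Set("api-version", apiVersion)
	return fmt.Sprintf("%s%s?%s", baseURL, parsed.Path, query.Encode())
}

func main() {
	u := buildAzureURL("https://example.openai.azure.com/openai",
		"/assistants?limit=10", "2023-05-15")
	fmt.Println(u)
	// Output: https://example.openai.azure.com/openai/assistants?api-version=2023-05-15&limit=10
}
```

Note that `url.Values.Encode` sorts parameters alphabetically, so `api-version` lands before `limit`; either ordering produces a single, valid query string.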
WhiteNightMo commented 1 month ago

But the baseURL comes from the config file; it's really the suffix that needs attention here.

baseURL := c.config.BaseURL
baseURL = strings.TrimRight(baseURL, "/")
parseURL, _ := url.Parse(baseURL)

Here's what I printed; it didn't change anything. (screenshot attached)

WhiteNightMo commented 1 month ago

@eiixy @sashabaranov

eiixy commented 1 month ago

@WhiteNightMo Could you please check if the unit tests cover your use case? #817

WhiteNightMo commented 1 month ago

@WhiteNightMo Could you please check if the unit tests cover your use case? #817

Thanks for the fix. In my case, the /audio/speech endpoint was missing from this check; I'm not sure whether other endpoints are missing as well.

if !containsSubstr([]string{
    "/completions",
    "/embeddings",
    "/chat/completions",
    "/audio/transcriptions",
    "/audio/translations",
    "/images/generations",
}, parseSuffix.Path) {
    return fmt.Sprintf("%s/%s%s?%s", baseURL, azureAPIPrefix, parseSuffix.Path, query.Encode())
}
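For reference, here is a plausible sketch of the `containsSubstr` helper used in the snippet above, which reports whether the path contains any of the listed endpoint suffixes. This is an assumption based on the call site; the actual implementation in go-openai may differ:

```go
package main

import (
	"fmt"
	"strings"
)

// containsSubstr reports whether s contains any element of substrs as a
// substring. In the snippet above it decides whether a request path matches
// one of the Azure OpenAI endpoints that require the deployments-style URL.
func containsSubstr(substrs []string, s string) bool {
	for _, sub := range substrs {
		if strings.Contains(s, sub) {
			return true
		}
	}
	return false
}

func main() {
	endpoints := []string{"/chat/completions", "/audio/speech"}
	fmt.Println(containsSubstr(endpoints, "/audio/speech")) // matches
	fmt.Println(containsSubstr(endpoints, "/assistants"))   // no match
}
```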
eiixy commented 1 month ago


I couldn't find any information about /audio/speech in the official Azure OpenAI documentation. Could you please provide a reference link to the documentation or further explain the origin and purpose of this endpoint? https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/stable/2024-06-01/inference.yaml

WhiteNightMo commented 1 month ago

OpenAI: https://platform.openai.com/docs/api-reference/audio/createSpeech
Azure OpenAI: https://learn.microsoft.com/en-us/azure/ai-services/openai/text-to-speech-quickstart?tabs=command-line#rest-api