Open gregwdata opened 2 months ago
Thank you for posting this @gregwdata! I've been banging my head against the wall for the past week trying to figure out what was wrong.
Unfortunately this is another instance where Azure OpenAI is unable to maintain parity with OpenAI. This kind of discrepancy makes it really tough to develop on Azure. It's bad enough having to wait weeks for the new models.
For my case, pinning openai = "^1.26.0" and langchain-openai = "0.1.8" avoided the problem. If anyone is using LangChain, you can roll back the versions as a temporary fix.
The issue has nothing to do with the OpenAI library or LangChain. It is the Azure OpenAI API that is behind. There is no temporary fix for this problem; it requires Microsoft to support the parameter in their API.
I had the same issue. Downgrading from openai==1.35.7 to openai==1.34.0 solved the problem.
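For anyone applying the downgrade workaround, the pins might look like this (versions taken from the comments above; verify them against your own dependency tree before relying on them):

```
# requirements.txt -- temporary workaround pins reported in this thread
openai==1.34.0
langchain-openai==0.1.8
```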
@iscai-msft Please help take a look, thanks.
`parallel_tool_calls` is not supported by the Azure OpenAI service yet. When it is supported, it will be added to the spec under the API version that supports its usage. You can follow the what's new page on MS Learn for updates.
@kristapratico I believe there is still a bug here even though `parallel_tool_calls` is not supported yet. Because:

- the service sets `parallel_tool_calls` to True by default
- there is no way to set `parallel_tool_calls` to False

So `parallel_tool_calls` is ON when using the Azure OpenAI service, and there is no way to turn it off.
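Until Azure accepts the parameter, the practical workaround is to omit `parallel_tool_calls` entirely from requests bound for Azure, since even passing `False` triggers the 400 error. A minimal sketch (the helper name and its defaults are hypothetical, not part of any library):

```python
def build_chat_kwargs(messages, tools, target_is_azure, parallel_tool_calls=False):
    """Build kwargs for chat.completions.create().

    Older Azure OpenAI api-versions reject parallel_tool_calls with a
    400 "Unknown parameter" error, so only include it when targeting
    openai.com.
    """
    kwargs = {"messages": messages, "tools": tools}
    if not target_is_azure:
        kwargs["parallel_tool_calls"] = parallel_tool_calls
    return kwargs
```

With this, requests to Azure fall back to the service default (parallel calls enabled), which matches the behavior described above: until the api-version supports the parameter, there is no way to turn it off.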
@wooters maybe I misunderstand, but the support coming would allow for the ability to turn `parallel_tool_calls` off. Both services have this on by default, and once supported by Azure, you'll be able to disable it, too.
@kristapratico Thanks for your response. I guess I misunderstood your comment:

> `parallel_tool_calls` is not supported by the Azure OpenAI service yet.

By this, I guess what you mean is: "the ability to use the `parallel_tool_calls` arg to turn OFF parallel tool calls."

Thanks for clarifying. I look forward to this being added to Azure.
@kristapratico Can I ask when you plan to support this feature? Following the changelog seems quite hopeless; there isn't any useful information at all. At the very least, I would like a patch that makes it compatible with the OpenAI API function signature, even if the feature isn't supported yet.
@kristapratico it would be really great if there were a feature release roadmap, even if the timelines were only by quarter.
We would like answers to questions like "when is a feature that OpenAI has already released going to be available on the Azure OpenAI Inference API?" Right now that specifically means `parallel_tool_calls` and `tool_choice=required`, released on OpenAI in June and April respectively. When will they be available on Azure? How do we get the answer to that question?
This is blocking our ability to use Azure services. We need to be able to disable parallel_tool_calls.
Waiting for this feature to be supported. I can't understand why the Azure team is this late.
Another reason not to use azure.
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unknown parameter: 'parallel_tool_calls'. (request id: 20240812154615201168063zjM398T3)", 'type': 'invalid_request_error', 'param': 'parallel_tool_calls', 'code': 'unknown_parameter'}}
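If you can't control which backend a request targets, a defensive pattern is to drop whichever parameter the 400 error names and retry. A stdlib-only sketch of just the cleanup step (the function name is hypothetical; the error shape mirrors the body above):

```python
def strip_unknown_param(request_kwargs, error_body):
    """Given a 400 error body like the one above, remove the offending
    parameter from the request kwargs so the call can be retried.

    Returns cleaned kwargs, or None if the error names no parameter we
    sent (in which case a retry would fail the same way).
    """
    param = error_body.get("error", {}).get("param")
    if param and param in request_kwargs:
        cleaned = dict(request_kwargs)
        del cleaned[param]
        return cleaned
    return None
```

The caller would catch `openai.BadRequestError`, pass `err.response.json()` (or the parsed body) here, and retry once with the cleaned kwargs if they are not None.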
Excuse me, two months have passed without any reply or progress?
No one working on this?
Hello! Providing a quick update on disabling `parallel_tool_calls` -- the work to support this parity feature across models/deployments is nearly complete and currently slated for the next service API release; barring unforeseen issues, it'll be part of a `2024-09-01-preview` api-version label in a few weeks. We usually try not to share precise dates like this, since so much can still be subject to change, but given how old this issue is, a more concrete ETA is owed.
As another engineer building on top of Azure OpenAI APIs (my team is responsible for client library integration), I empathize with the frustration and agree that it's taken way longer than anyone would like for this seemingly simple parity feature to become available. As for why it's taken so long, it boils down to the architectural differences between OpenAI's global endpoint and Azure OpenAI's deployment-based system: Azure OpenAI has component boundaries and abstractions that OpenAI doesn't, and while that confers some nice advantages in a lot of circumstances, it also turns out that consistently having properly configured traffic routed across many permutations of versioned model deployments, regions, and provisioning setups has revealed a lot of rough edges that spanned many teams. That's not an attempt to make an excuse -- we're using this and similar instances to continually improve these annoying parity latency situations -- but rather just to confirm that it's not been neglected or abandoned. It's just been a lot more complicated than anyone would've guessed.
Thank you for the continued patience!
Thanks so much for the update @trrwilson. Much appreciated!
@trrwilson related: do you know if the next service API release will include the ability to set `tool_choice: "required"`? (as described here and here)
That one's there now! `required` was recently made available on the `2024-07-01-preview` service API label; here's the line where it's newly defined in the spec and here's the prior preview version where it was conspicuously absent.
We haven't yet published client library updates for all languages, and protocol/REST-level calls may be needed for those cases (.NET/Java/Go), but from an API perspective it's now supported.
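For reference, a request body exercising this at the REST level might look like the following sketch (the deployment name and tool schema are placeholders, not taken from this thread; it assumes `api-version=2024-07-01-preview` as stated above):

```python
# Hypothetical chat-completions request body; POST it to
# https://<resource>.openai.azure.com/openai/deployments/<deployment>/
#     chat/completions?api-version=2024-07-01-preview
payload = {
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # placeholder tool definition
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    # Newly available on 2024-07-01-preview: force the model to call a tool.
    "tool_choice": "required",
}
```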
@trrwilson really appreciate the update and especially some of the details you shared. Being as transparent as you can with this stuff goes a long way. I appreciate the difficulty of giving timelines but even just seeing things in a now/next/later roadmap would be great.
@trrwilson re features like `tool_choice=required`: what is the source of truth for knowing what's actually deployed? This particular feature was in the API spec of the latest stable version but not actually working; the backend was throwing an "option not supported" error.
FYI - just tried `tool_choice=required` and it's not working for us. Guessing it's best to handle this via a support request.
Is there any update on this? This breaks LangGraph pretty badly, as it doesn't natively support parallelism.
It seems the `parallel_tool_calls` parameter is supported with Azure OpenAI API version `2024-08-01-preview`, released yesterday.
I've just been upgrading and testing (via the C# lib) against a new 06-08-2024 deployment in eastus - though the Azure SDK doesn't have the REST API version added yet, so I'm using reflection to set it.
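If that report is right, a client only needs to check which api-version it is targeting before sending the parameter. A rough heuristic, assuming `2024-08-01-preview` is the first label to accept it (as stated above; the function name is hypothetical):

```python
def azure_supports_parallel_tool_calls(api_version: str) -> bool:
    """Rough check: parallel_tool_calls is reported to work starting
    with the 2024-08-01-preview api-version. Labels are date-prefixed,
    so an ISO-date string comparison is enough for this heuristic.
    """
    date = api_version.split("-preview")[0]
    return date >= "2024-08-01"
```

A caller could use this to decide whether to include `parallel_tool_calls=False` in the request or silently drop it for older api-versions.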
API Spec link
https://github.com/Azure/azure-rest-api-specs/blob/main/specification/cognitiveservices/data-plane/AzureOpenAI/inference/preview/2024-05-01-preview/generated.json
API Spec version
2024-05-01-preview
Describe the bug
Support for the `parallel_tool_calls` parameter on the OpenAI API spec (https://platform.openai.com/docs/api-reference/chat/create#chat-create-parallel_tool_calls) is not implemented in the Azure API spec.
Appears to be the cause of this issue filed on the OpenAI repo: https://github.com/openai/openai-python/issues/1492
Expected behavior
Use the `parallel_tool_calls` parameter as described in: https://platform.openai.com/docs/api-reference/chat/create#chat-create-parallel_tool_calls
Actual behavior
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: parallel_tool_calls. Please contact us through an Azure support request at: https://go.microsoft.com/fwlink/?linkid=2213926 for further questions.', 'type': 'invalid_request_error', 'param': None, 'code': None}}
Reproduction Steps
Environment
No response