bahtman opened 3 weeks ago
A good practice when authenticating with Azure OpenAI is to use a token provider, following the example at https://docs.litellm.ai/docs/providers/azure#azure-ad-token-refresh---defaultazurecredential.
The reason for passing a token provider rather than fetching a token once is that tokens expire after a limited lifetime, and the provider callable refreshes them automatically.
When doing this, we get a TypeError saying the token provider is not JSON serializable.
This was also experienced by @arthurgreef in https://github.com/stanfordnlp/dspy/issues/1561#issue-2554542697, although it was "fixed" there by fetching the token directly instead of passing the callable that handles token refresh.
I assume this is a byproduct of caching, but the behaviour is still present with the cache=False parameter.
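
For reference, here is a minimal sketch of the setup that triggers the error. The deployment name, endpoint, and API version are placeholders, and it assumes extra kwargs such as azure_ad_token_provider are forwarded from dspy.LM to litellm as in the linked litellm docs:

```python
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
import dspy

# The token provider returns a fresh Azure AD token on each call,
# so expiring tokens are refreshed automatically.
credential = DefaultAzureCredential()
token_provider = get_bearer_token_provider(
    credential, "https://cognitiveservices.azure.com/.default"
)

# Placeholder deployment/endpoint values.
lm = dspy.LM(
    "azure/<your-deployment-name>",
    api_base="https://<your-resource>.openai.azure.com/",
    api_version="2024-02-15-preview",
    azure_ad_token_provider=token_provider,
    cache=False,
)
dspy.configure(lm=lm)

# Fails with: TypeError: Object of type function is not JSON serializable
print(lm("Hello"))
```

The workaround from #1561 amounts to calling credential.get_token(...) once and passing the resulting string as azure_ad_token instead, which avoids the serialization error but loses automatic token refresh for long-running processes.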