Closed venkatesh3893 closed 1 month ago
This is working for me right now on Azure. I think you must ensure the model name you're using matches what's in your deployment details. I'm using a different model below but the techniques should be the same. I hope this helps.
```yaml
models:
  - type: main
    engine: azure
    model: gpt-4o
    parameters:
      azure_endpoint: https://yourendpoint.openai.azure.com/
      api_version: 2023-03-15-preview
      deployment_name: GPT-4o
      api_key: averylongsetofcharacters
```
Important: if the api_version can be interpreted as a date, be sure to put it in double quotes. That isn't required here because the version has the -preview suffix, but if the version is just a date (like many OpenAI API versions), you must quote it. I left the quotes off when using a non-preview api_version and got errors.
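The quoting rule above is a consequence of YAML's type resolution, and you can check it directly with PyYAML (used here only to illustrate how the values parse; NeMo Guardrails' own loader may behave slightly differently):

```python
import yaml

# An unquoted date-like value is resolved by YAML to a date object,
# not a string, which is not what the Azure client expects.
unquoted = yaml.safe_load("api_version: 2023-07-01")
quoted = yaml.safe_load('api_version: "2023-07-01"')
suffixed = yaml.safe_load("api_version: 2023-03-15-preview")

print(type(unquoted["api_version"]))  # <class 'datetime.date'>
print(type(quoted["api_version"]))    # <class 'str'>
print(type(suffixed["api_version"]))  # <class 'str'> (the suffix breaks the date pattern)
```

So a bare `2023-07-01-preview` is safe, while a bare `2023-07-01` silently becomes a date.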
Thanks @PeerBoerner1!
Hi @venkatesh3893, could you follow @PeerBoerner1's solution?
@venkatesh3893, since you've reacted with a thumbs up, I assume the issue has been resolved, and I'll proceed to close it.
I am trying to establish guardrails setup for Azure OpenAI models. I have used the below configuration in config.yaml.
```yaml
models:
  - type: main
    engine: azure
    model: gpt-4
    parameters:
      azure_endpoint: ""
      api_version: "2023-07-01-preview"
      openai_api_version: "2023-07-01-preview"
      deployment_name: gpt-4
      api_key: ""
```
I have tried multiple combinations, but it just didn't work and no error was thrown.
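When a config fails silently like this, one thing worth checking is that the YAML actually parses into the structure shown in the accepted answer. A minimal sanity-check sketch using PyYAML (the endpoint and api_key values below are placeholders, not real credentials):

```python
import yaml

# Candidate config; azure_endpoint and api_key are placeholders.
config_text = """
models:
  - type: main
    engine: azure
    model: gpt-4
    parameters:
      azure_endpoint: "https://my-resource.openai.azure.com/"
      api_version: "2023-07-01-preview"
      deployment_name: gpt-4
      api_key: "placeholder"
"""

parsed = yaml.safe_load(config_text)

# A common silent failure: writing `-type: main` (no space after the dash)
# makes `models` a mapping with a key named "-type" instead of a list of
# model entries, so verify the parsed shape explicitly.
assert isinstance(parsed["models"], list), "models must parse as a YAML list"
model = parsed["models"][0]
assert model.get("type") == "main"
assert isinstance(model["parameters"]["api_version"], str), "quote date-like versions"
print("config structure looks OK")
```

If any of these assertions fails, the problem is in the YAML itself rather than in the Azure credentials.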