Open Loo-Ree opened 3 months ago
I worked around the issue by changing the following parameters in main.bicep: chatGptDeploymentCapacity, chatGptModelName, and chatGptModelVersion, using the values from my existing deployment. Still, I'm wondering why these checks are in place when I'm providing my own resource. Maybe these values should be exposed as variables so the main.bicep file doesn't need to be edited directly? That could be part of the customization docs. Anyway, it's up to you whether to close this issue or relabel it as a feature request. Thanks!
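The suggestion above could be sketched as a change to main.bicep: promote the three values to parameters that can be overridden at deploy time. This is a hypothetical illustration, not the accelerator's actual code, and the default values shown are assumptions, not the repository's real defaults.

```bicep
// Hypothetical sketch: expose the model settings as overridable
// parameters so main.bicep itself need not be edited.
// Defaults below are illustrative assumptions only.
param chatGptDeploymentCapacity int = 30
param chatGptModelName string = 'gpt-35-turbo'
param chatGptModelVersion string = '0613'
```

Values like these could then be supplied per environment (for example through a parameters file) instead of by editing main.bicep in place.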
Minimal steps to reproduce
Any log messages given by the failure
ERROR: deployment failed: failing invoking action 'provision', error deploying infrastructure: deploying to subscription:
Deployment Error Details: InvalidTemplateDeployment: The template deployment 'openai' is not valid according to the validation procedure. The tracking id is 'bffxxx'. See inner errors for details. CannotChangeDeploymentModel: The model of deployment cannot be changed.
TraceID: cc1xxx
ERROR: error executing step command 'provision': deployment failed: failing invoking action 'provision', error deploying infrastructure: deploying to subscription:
Deployment Error Details: InvalidTemplateDeployment: The template deployment 'openai' is not valid according to the validation procedure. The tracking id is 'bffxxx'. See inner errors for details. CannotChangeDeploymentModel: The model of deployment cannot be changed.
TraceID: cc1xxx
Expected/desired behavior
OS and Version?
azd version?
Versions
Mention any other details that might be useful
This is not the first time I have installed this accelerator; previous installs succeeded, but this round I was out of luck. I tried several times (after purging all created resources) and always ended up with this error.