Azure / bicep-types-az

Bicep type definitions for ARM resources
MIT License

Deploying multiple deployments for Microsoft.CognitiveServices/accounts #1730

Open djtje12 opened 1 year ago

djtje12 commented 1 year ago

Bicep version 0.18.4

Describe the bug

When deploying multiple model deployments to an OpenAI resource, I get the following error:

Another operation is being performed on the parent resource '/subscriptions/subscriptionid/resourceGroups/resourcegroupname/providers/Microsoft.CognitiveServices/accounts/openaitestname'. Please try again later.

To Reproduce

Steps to reproduce the behavior: deploy the Bicep below:

var modelsToUse = [
  {
    name: 'gpt-35-turbo'
    version: '0301'
    identifier: 1
    capacity: 120
  }
  {
    name: 'text-davinci-003'
    version: '1'
    identifier: 1
    capacity: 60
  }
  {
    name: 'text-embedding-ada-002'
    version: '2'
    identifier: 1
    capacity: 120
  }
]

resource openAi 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: 'openaitestname'
  location: 'westeurope'
  sku: {
    name: 'S0'
  }
  kind: 'OpenAI'
  properties: {
    customSubDomainName: 'openaitestname'
    networkAcls: {
      defaultAction: 'Deny'
    }
    publicNetworkAccess: 'Disabled'
    disableLocalAuth: true
  }

  resource deployments 'deployments@2023-05-01' = [for model in modelsToUse: {
    name: toLower('openaitestname-${model.name}-${model.version}-${model.identifier}')
    sku: {
      name: 'Standard'
      capacity: model.capacity
    }
    properties: {
      model: {
        format: 'OpenAI'
        name: model.name
        version: model.version
      }
    }
  }]
}

Additional context

Adding an explicit dependency from each deployment on the previous one makes the template deploy correctly:

resource openAi 'Microsoft.CognitiveServices/accounts@2023-05-01' = {
  name: 'openaitestname'
  location: 'westeurope'
  sku: {
    name: 'S0'
  }
  kind: 'OpenAI'
  properties: {
    customSubDomainName: 'openaitestname'
    networkAcls: {
      defaultAction: 'Deny'
    }
    publicNetworkAccess: 'Disabled'
    disableLocalAuth: true
  }

  resource deployments 'deployments@2023-05-01' = [for model in [ modelsToUse[0] ]: {
    name: toLower('openaitestname-${model.name}-${model.version}-${model.identifier}')
    sku: {
      name: 'Standard'
      capacity: model.capacity
    }
    properties: {
      model: {
        format: 'OpenAI'
        name: model.name
        version: model.version
      }
    }
  }]

  resource deployments2 'deployments@2023-05-01' = [for model in [ modelsToUse[1] ]: {
    name: toLower('openaitestname-${model.name}-${model.version}-${model.identifier}')
    sku: {
      name: 'Standard'
      capacity: model.capacity
    }
    properties: {
      model: {
        format: 'OpenAI'
        name: model.name
        version: model.version
      }
    }
    dependsOn: [
      openAi::deployments
    ]
  }]

  resource deployments3 'deployments@2023-05-01' = [for model in [ modelsToUse[2] ]: {
    name: toLower('openaitestname-${model.name}-${model.version}-${model.identifier}')
    sku: {
      name: 'Standard'
      capacity: model.capacity
    }
    properties: {
      model: {
        format: 'OpenAI'
        name: model.name
        version: model.version
      }
    }
    dependsOn: [
      openAi::deployments2
    ]
  }]
}

It looks to me like an OpenAI resource can't handle multiple model deployments being created at the same time. Is that something Bicep could check or validate, or is this expected behaviour that should be documented more clearly?

jongio commented 1 year ago

Use the @batchSize(1) decorator: https://github.com/Azure/azure-dev/blob/main/templates/common/infra/bicep/core/ai/cognitiveservices.bicep#L26
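Applied to the original repro, a minimal sketch of that suggestion (reusing the modelsToUse variable and the child resource from the template above) could look like this: the @batchSize(1) decorator on the loop forces Bicep to deploy the iterations serially instead of in parallel, so only one operation runs against the parent account at a time.

```bicep
  // Deploy one model deployment at a time to avoid the
  // "Another operation is being performed on the parent resource" conflict.
  @batchSize(1)
  resource deployments 'deployments@2023-05-01' = [for model in modelsToUse: {
    name: toLower('openaitestname-${model.name}-${model.version}-${model.identifier}')
    sku: {
      name: 'Standard'
      capacity: model.capacity
    }
    properties: {
      model: {
        format: 'OpenAI'
        name: model.name
        version: model.version
      }
    }
  }]
```

This keeps a single loop, so no manual dependsOn chain between per-model copies of the resource is needed.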