hashicorp / terraform-provider-azurerm

Terraform provider for Azure Resource Manager
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs
Mozilla Public License 2.0

Data collection rule not shown on the Log Analytics workspace ContainerLog table #25671

Open sb185296 opened 6 months ago

sb185296 commented 6 months ago

Is there an existing issue for this?

Community Note

Terraform Version

1.8.0

AzureRM Provider Version

3.98.0

Affected Resource(s)/Data Source(s)

azurerm_monitor_data_collection_rule

Terraform Configuration Files

resource "azurerm_monitor_data_collection_rule" "logs" {
  name                              = "${local.env}-${local.prefix}-datacollection-rules-logs"
 location              = data.azurerm_resource_group.example.location
 resource_group_name   = data.azurerm_resource_group.example.name
  kind                              = "WorkspaceTransforms"
  #data_collection_endpoint_id       = azurerm_monitor_data_collection_endpoint.logs-collection-endpoint.id

  destinations {
    log_analytics {
      workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
      name                  = "Log-filter"
    }
  }

  data_flow {
    streams = ["Microsoft-Table-ContainerLog"]
    destinations = ["Log-filter"]
    transform_kql = "source | where LogEntry startswith '{\"message\"' or LogEntry startswith '{\"level\"' and Computer !startswith 'aks-monitor'"
  }

  description = "DCR ContainerLog Fillter"
}

Debug Output/Panic Output

azurerm_monitor_data_collection_rule.logs: Creating...
azurerm_monitor_data_collection_rule.logs: Creation complete after 6s [id=/subscriptions/-9c50-e/resourceGroups/rg-qa-02/providers/Microsoft.Insights/dllection-rules-logs]

Apply complete! Resources: 1 added, 0 changed, 0 destroyed.

Expected Behaviour

No response

Actual Behaviour

When going to the Azure portal under the Log Analytics workspace and looking at its tables, we can see that the ContainerLog table did not get linked to the DCR rule that we created.

Following the same steps manually in the portal, it works.

JSON exported from the Azure portal:

{
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "dataCollectionRules_poc_rule_logs_qa_02_name": {
            "defaultValue": "poc-rule-logs-qa-02",
            "type": "String"
        },
        "workspaces_log_analytics_workspace_qa_02_externalid": {
            "defaultValue": "/subscriptions/0das/resourceGroups/rg-qa-02/providers/microsoft.operationalinsights/workspaces/log-analytics-workspace-qa-02",
            "type": "String"
        }
    },
    "variables": {},
    "resources": [
        {
            "type": "Microsoft.Insights/dataCollectionRules",
            "apiVersion": "2022-06-01",
            "name": "[parameters('dataCollectionRules_poc_rule_logs_qa_02_name')]",
            "location": "uksouth",
            "kind": "WorkspaceTransforms",
            "properties": {
                "dataSources": {},
                "destinations": {
                    "logAnalytics": [
                        {
                            "workspaceResourceId": "[parameters('workspaces_log_analytics_workspace_qa_02_externalid')]",
                            "name": "fe1648c6776443b5aad41af93b7d9549"
                        }
                    ]
                },
                "dataFlows": [
                    {
                        "streams": [
                            "Microsoft-Table-ContainerLog"
                        ],
                        "destinations": [
                            "fe1648c6776443b5aad41af93b7d9549"
                        ],
                        "transformKql": "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
                    }
                ]
            }
        }
    ]
}

Steps to Reproduce

terraform apply

Important Factoids

No response

References

No response

ziyeqf commented 6 months ago

Hi @sb185296, thanks for opening this issue.

Could you please explain a bit more about "the ContainerLog table did not get linked to the DCR rule that we created"? I checked the payload of the requests made by Terraform during creation of azurerm_monitor_data_collection_rule, and it's almost the same as the template you posted. The only differences are destinations.log_analytics.name and data_flow.transform_kql. Can you give it another try after updating these two values?

Thanks.
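
(For anyone following along, a minimal sketch of the change being suggested here, borrowing the two differing values from the portal-exported template above; whether the destination name must take exactly that form is worked out later in the thread.)

  destinations {
    log_analytics {
      workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
      name                  = "fe1648c6776443b5aad41af93b7d9549" # name taken from the exported template
    }
  }

  data_flow {
    streams       = ["Microsoft-Table-ContainerLog"]
    destinations  = ["fe1648c6776443b5aad41af93b7d9549"]
    transform_kql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
  }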

sb185296 commented 6 months ago

hi team

If we go to the portal under ContainerLog (see the attached screenshot), you can see that there is a DCR linked to that table.

But via Terraform it's not linked to that table; the DCR list is empty and the portal asks to create a new one.

I didn't understand what change you are asking me to perform?

ziyeqf commented 6 months ago

Hi @sb185296,

Please try editing your TF config, replacing the original transform_kql with the value you provided in the ARM template.
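
(Side note, not part of the original suggestion: Terraform heredocs don't process backslash escapes, so the same KQL from the ARM template could arguably be written more readably as below; a sketch, equivalent up to trailing newlines.)

  transform_kql = <<-EOT
    source
    | where LogEntry startswith "{\"message\"" or LogEntry startswith "{\"level\"" and Computer !startswith "aks-monitor"
  EOT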

sb185296 commented 6 months ago

Put this?

"source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
ziyeqf commented 6 months ago

Put this?

"source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"

Yes.

sb185296 commented 5 months ago

Hi,

In the yellow highlight there is now an error on the rule that I created manually, which I have since removed (see the attached screenshot). Again I created a new data collection rule, and it is still not shown as attached to the ContainerLog table.

resource "azurerm_monitor_data_collection_rule" "logs" {
  name                              = "${local.env}-${local.prefix}-datacollection-rules-logs"
 location              = data.azurerm_resource_group.example.location
 resource_group_name   = data.azurerm_resource_group.example.name
  kind                              = "WorkspaceTransforms"
  #data_collection_endpoint_id       = azurerm_monitor_data_collection_endpoint.logs-collection-endpoint.id

  destinations {
    log_analytics {
      workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
      name                  = "Log-filter"
    }
  }

  data_flow {
    streams = ["Microsoft-Table-ContainerLog"]
    destinations = ["Log-filter"]
    transform_kql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
  }

  description = "DCR ContainerLog Fillter"
}
sb185296 commented 5 months ago

After deleting everything and creating it with Terraform, it does not get updated... again (see the attached screenshots).

ziyeqf commented 5 months ago

I'm looking into this

sb185296 commented 5 months ago

I googled it and found a workaround that works:

resource "azapi_resource" "logs" {
  type      = "Microsoft.Insights/dataCollectionRules@2021-09-01-preview"
  name      = "dcr-for-log-filter"
  parent_id = data.azurerm_resource_group.example.id
  location  = data.azurerm_resource_group.example.location

  body = jsonencode(
    {
      kind      = "WorkspaceTransforms"
      properties = {
        description = "Data collection rule for log filter"

      dataFlows = [
        {
          streams = [
            "Microsoft-ContainerLog"
          ],
          destinations = [
            "log-filter"
          ],
          transformKql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
          #outputStream = "Microsoft-Table-ContainerLog"
        }
      ]
        destinations = {
          logAnalytics = [
            {
              name                = "log-filter"
              workspaceResourceId = module.LOG-ANALYTICS-WORKSPACE.id
            }
          ]
        }
      }
    }
  )
}

resource "null_resource" "connect_dcr_to_log_analytics" {

  provisioner "local-exec" {
    command     = <<-EOT
      az monitor log-analytics workspace update --resource-group ${data.azurerm_resource_group.example.name} --workspace-name ${module.LOG-ANALYTICS-WORKSPACE.name} --data-collection-rule ${azapi_resource.logs.id}
    EOT
  }

  depends_on = [
    azapi_resource.logs,
  ]
}

With the az command it works, so maybe there is a gap in Terraform? Or maybe I didn't use the right resource?

https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-workspace-transformations-api


ziyeqf commented 5 months ago

The "az" command works as same as the data_collection_id property of azurerm_log_analytics_workspace: https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/log_analytics_workspace#data_collection_rule_id

So it may require an update for this property. Could you please give it a try?
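
A sketch of what that might look like, assuming the workspace is declared directly with azurerm_log_analytics_workspace rather than via the module used above (the names here are illustrative):

resource "azurerm_log_analytics_workspace" "example" {
  name                = "log-analytics-workspace-qa-02"
  location            = data.azurerm_resource_group.example.location
  resource_group_name = data.azurerm_resource_group.example.name
  sku                 = "PerGB2018"

  # Points the workspace's default DCR at the rule, mirroring the az CLI call above
  data_collection_rule_id = azurerm_monitor_data_collection_rule.logs.id
}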

sb185296 commented 5 months ago

But that is for the whole workspace... I don't want it used for everything; the update is only for one table.

Maybe I need this?

resource "azurerm_monitor_data_collection_rule_association" "example1" {
  name                    = "${local.env}-${local.prefix}-dcra"
  target_resource_id      = module.LOG-ANALYTICS-WORKSPACE.id
  data_collection_rule_id = azapi_resource.logs.id
  description             = "example"
}

sb185296 commented 5 months ago

Also I can see that even with the Terraform config I shared, in the portal it's not really working.

It does not update the source query.

And there is an error (see the attached screenshot): Error: Data collection rule is not valid for kind 'WorkspaceTransforms'. The rule of kind 'WorkspaceTransforms' must have exactly one Log Analytics workspace destination (and no other).

Nazargora commented 5 months ago

Hi, I have the same problem. If you find a solution, please ping me.

sb185296 commented 5 months ago

Hi, I have the same problem. If you find a solution, please ping me.

Hi, can you help me?

Nazargora commented 5 months ago

Hi, can you help me?

Hi, how can I help you?

sb185296 commented 5 months ago

Hi, how can I help you?

How were you able to resolve it? Can you share?

Nazargora commented 5 months ago

How were you able to resolve it? Can you share?

I wrote that I have the same issue and cannot resolve it either :(

Nazargora commented 3 months ago

How were you able to resolve it? Can you share?

Hi, I found out how to resolve it :)

Nazargora commented 3 months ago
  dataFlows = [
    {
      streams = [
        "Microsoft-ContainerLog"
      ],
      destinations = [
        "log-filter" # HERE MUST BE THE WORKSPACE ID, BUT WITHOUT HYPHENS !!!!!
      ],
      transformKql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
      #outputStream = "Microsoft-Table-ContainerLog"
    }
  ]
  destinations = {
    logAnalytics = [
      {
        name                = "log-filter" # HERE MUST BE THE WORKSPACE ID, BUT WITHOUT HYPHENS !!!!!
        workspaceResourceId = module.LOG-ANALYTICS-WORKSPACE.id
      }
    ]
  }

I figured this out by creating the DCR manually and checking the exported template. You're welcome :)
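
If the workspace module exposes the workspace GUID (the customer ID, not the full resource ID), that hyphen-less name can be derived rather than hard-coded; a sketch, assuming a hypothetical module output named workspace_id that holds the GUID:

locals {
  # e.g. "fe1648c6-7764-43b5-aad4-1af93b7d9549" -> "fe1648c6776443b5aad41af93b7d9549"
  dcr_destination_name = replace(module.LOG-ANALYTICS-WORKSPACE.workspace_id, "-", "")
}

local.dcr_destination_name would then be used both as destinations.log_analytics.name and in data_flow.destinations.
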
ramaral82 commented 1 month ago

Hi,

In the past I was able to create a DCR and assign it to the desired tables.

I used this code:

resource "azurerm_monitor_data_collection_endpoint" "data_collection_endpoint" {
  name                          = "dce-data-platform-${var.env_code}-networking"
  resource_group_name           = module.svc_rg_main.rg_name
  location                      = module.svc_rg_main.rg_location
  public_network_access_enabled = true
  description                   = "Data Collection Endpoint for ${var.env_code} environment."
  tags                          = var.tags_data_collection_endpoint
}

variable "custom_table_names" {
  default = [
    "Custom-DVBDatavaultLoadLog_CL",
    "Custom-DVBJobLoadLog_CL"
  ]
}

resource "azurerm_log_analytics_saved_search" "custom_tables" {
  count                      = length(var.custom_table_names)
  name                       = var.custom_table_names[count.index]
  log_analytics_workspace_id = var.workspace_resource_ids[var.env_code]
  display_name               = var.custom_table_names[count.index]
  category                   = "CustomLogs"
  query                      = "TableName_CL | project *"
}

resource "azurerm_monitor_data_collection_rule" "data_collection_rule" {
  name                        = "dcr-data-platform-${var.env_code}-log"
  resource_group_name         = "rg-dataplatform-${var.env_code}-log"
  location                    = module.svc_rg_main.rg_location
  data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.data_collection_endpoint.id
  description                 = "Data Collection Rule for ${var.env_code} environment."
  tags                        = var.tags_data_collection_rule

  destinations {
    log_analytics {
      workspace_resource_id = var.workspace_resource_ids[var.env_code]
      name                  = var.workspace_ids[var.env_code]
    }
  }

  # Define the data flow for the data collection rule
  data_flow {
    streams       = ["Custom-DVBDatavaultLoadLog_CL"]
    destinations  = [var.workspace_ids[var.env_code]]
    output_stream = "Custom-DVBDatavaultLoadLog_CL"
    transform_kql = "source\n| extend TimeGenerated = load_entry_time\n"
  }

  data_flow {
    streams       = ["Custom-DVBJobLoadLog_CL"]
    destinations  = [var.workspace_ids[var.env_code]]
    output_stream = "Custom-DVBJobLoadLog_CL"
    transform_kql = "source\n| extend TimeGenerated = load_entry_time\n"
  }

  stream_declaration {
    stream_name = "Custom-DVBDatavaultLoadLog_CL"
    column {
      name = "load_entry_id"
      type = "int"
    }
    column {
      name = "load_entry_time"
      type = "datetime"
    }
    column {
      name = "object_id"
      type = "string"
    }
  }

  stream_declaration {
    stream_name = "Custom-DVBJobLoadLog_CL"
    column {
      name = "load_entry_id"
      type = "int"
    }
    column {
      name = "load_entry_time"
      type = "datetime"
    }
    column {
      name = "job_id"
      type = "string"
    }
  }
}

Now I'm trying to create a new DCR for another table, using exactly the same code and just changing the name of the DCR (the tables are already created directly in the portal), but I'm getting a 400 error:

azurerm_monitor_data_collection_rule.data_collection_rule: Modifying... [id=/subscriptions/xxxxx-xxxxxxxxxx-xxxxxxxxx-xxxxx/resourceGroups/rg-dataplatform-d-log/providers/Microsoft.Insights/dataCollectionRules/dcr-data-platform-d-log]
╷
│ Error: updating Data Collection Rule (Subscription: "xxxxx-xxxf-xxx9-8442-1xxxxxxxxx"
│ Resource Group Name: "rg-dataplatform-d-log"
│ Data Collection Rule Name: "dcr-data-platform-d-log"): unexpected status 400 (400 Bad Request) with error: InvalidPayload: Data collection rule is invalid
│
│   with azurerm_monitor_data_collection_rule.data_collection_rule,
│   on data_collection_endpoint.tf line 43, in resource "azurerm_monitor_data_collection_rule" "data_collection_rule":
│   43: resource "azurerm_monitor_data_collection_rule" "data_collection_rule" {
│
│ updating Data Collection Rule (Subscription:
│ "xxxxxxxxx-28df-4499-8442-xxxxxxxxxxxxxx"
│ Resource Group Name: "rg-dataplatform-d-log"
│ Data Collection Rule Name: "dcr-data-platform-d-log"): unexpected status
│ 400 (400 Bad Request) with error: InvalidPayload: Data collection rule is
│ invalid

I'm using this:

resource "azurerm_monitor_data_collection_rule" "data_collection_rule_2" { name = "dcr-data-platform-${var.env_code}-log-2" resource_group_name = "rg-dataplatform-${var.env_code}-log" location = module.svc_rg_main.rg_location data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.data_collection_endpoint.id description = "Data Collection Rule 2 for ${var.env_code} environment." tags = var.tags_data_collection_rule

destinations { log_analytics { workspace_resource_id = var.workspace_resource_ids[var.env_code] name = var.workspace_ids[var.env_code] } }

data_flow { streams = ["Custom-Synapse_Logs_Expanded_CL"] destinations = [var.workspace_ids[var.env_code]] output_stream = "Custom-Synapse_Logs_Expanded_CL" transform_kql = "source\n| extend TimeGenerated = STARTTIME\n" } stream_declaration { stream_name = "Custom-Synapse_Logs_Expanded_CL" column { name = "TimeGenerated" type = "datetime" } column { name = "ACTIVITYITERATIONCOUNT" type = "int" } column { name = "ACTIVITYNAME" type = "string" } column { name = "ACTIVITYRETRYCOUNT" type = "int" }

What i'm doing wrong here????