Open · sb185296 opened this issue 6 months ago
Hi @sb185296, thanks for opening this issue.
Could you please explain a bit more about "we can see that the table did not get linked to the DCR that we created"?
I checked the payload of the requests made by Terraform during creation of azurerm_monitor_data_collection_rule, and it's almost the same as the template you posted. The only differences are destinations.log_analytics.name and data_flow.transform_kql. Can you give it another try after updating these two values?
Thanks.
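(For reference, a minimal sketch of the two attributes in question; the resource and module names are placeholders, and the destination name is arbitrary but must match the entry in data_flow.destinations:

resource "azurerm_monitor_data_collection_rule" "example" {
  # ...
  destinations {
    log_analytics {
      workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
      name                  = "Log-filter"
    }
  }

  data_flow {
    streams       = ["Microsoft-Table-ContainerLog"]
    destinations  = ["Log-filter"] # must match destinations.log_analytics.name above
    transform_kql = "<the transformKql value from the ARM template>"
  }
}
)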
Hi team,
If we go to the portal under ContainerLog, you can see that there is a DCR linked to it.
But via Terraform it is not linked to that table; the DCR list is empty and it asks to create a new one.
I didn't understand what change you are asking me to perform?
Hi @sb185296,
Please try to edit your TF config, replacing the original transformKql with the value you provided in the ARM template.
Put this?
"source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
yes
Hi,
In yellow there is now an error on the rule I created manually, which I then removed. Again I created a new data collection rule, and it still does not show as attached to the ContainerLog table.
resource "azurerm_monitor_data_collection_rule" "logs" {
name = "${local.env}-${local.prefix}-datacollection-rules-logs"
location = data.azurerm_resource_group.example.location
resource_group_name = data.azurerm_resource_group.example.name
kind = "WorkspaceTransforms"
#data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.logs-collection-endpoint.id
destinations {
log_analytics {
workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
name = "Log-filter"
}
}
data_flow {
streams = ["Microsoft-Table-ContainerLog"]
destinations = ["Log-filter"]
transform_kql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
}
description = "DCR ContainerLog Fillter"
}
Even after you delete everything and create it with Terraform, it does not update it... again.
I'm looking into this
I googled it and found a workaround that works:
resource "azapi_resource" "logs" {
type = "Microsoft.Insights/dataCollectionRules@2021-09-01-preview"
name = "dcr-for-log-filter"
parent_id = data.azurerm_resource_group.example.id
location = data.azurerm_resource_group.example.location
body = jsonencode(
{
kind = "WorkspaceTransforms"
properties = {
description = "Data collection rule for log filter"
dataFlows = [
{
streams = [
"Microsoft-ContainerLog"
],
destinations = [
"log-filter"
],
transformKql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
#outputStream = "Microsoft-Table-ContainerLog"
}
]
destinations = {
logAnalytics = [
{
name = "log-filter"
workspaceResourceId = module.LOG-ANALYTICS-WORKSPACE.id
}
]
}
}
}
)
}
resource "null_resource" "connect_dcr_to_log_analytics" {
provisioner "local-exec" {
command = <<-EOT
az monitor log-analytics workspace update --resource-group ${data.azurerm_resource_group.example.name} --workspace-name ${module.LOG-ANALYTICS-WORKSPACE.name} --data-collection-rule ${azapi_resource.logs.id}
EOT
}
depends_on = [
azapi_resource.logs,
]
}
With the az command it works, so maybe we have a gap in Terraform? Or maybe I didn't use the right resource?
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/tutorial-workspace-transformations-api
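(To check whether the link actually took effect, one could inspect the workspace afterwards; a sketch, assuming the az CLI is logged in to the right subscription and using the same placeholder names as above:

az monitor log-analytics workspace show \
  --resource-group <example-rg> \
  --workspace-name <example-workspace>

If the update worked, the linked DCR should then be visible in the returned workspace properties, as far as I can tell as defaultDataCollectionRuleResourceId.)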
The "az" command works as same as the data_collection_id
property of azurerm_log_analytics_workspace
: https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/log_analytics_workspace#data_collection_rule_id
So it may require an update for this property. Could you please give it a try?
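(A minimal sketch of what that could look like, assuming the workspace were managed directly in this configuration rather than through the module; the names are placeholders:

resource "azurerm_log_analytics_workspace" "example" {
  name                    = "example-workspace"
  location                = data.azurerm_resource_group.example.location
  resource_group_name     = data.azurerm_resource_group.example.name
  sku                     = "PerGB2018"
  # links the workspace to the DCR, like the az CLI command above
  data_collection_rule_id = azapi_resource.logs.id
}
)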
But that applies to the whole workspace... I don't want it to use this specific property; the update is only for one table.
Maybe I need this?

resource "azurerm_monitor_data_collection_rule_association" "example1" {
  name                    = "${local.env}-${local.prefix}-dcra"
  target_resource_id      = module.LOG-ANALYTICS-WORKSPACE.id
  data_collection_rule_id = azapi_resource.logs.id
  description             = "example"
}
Also, I can see that even with the Terraform code I shared, in the portal it is not really working;
it does not update the source query,
and there is an error: Error: Data collection rule is not valid for kind 'WorkspaceTransforms'. The rule of kind 'WorkspaceTransforms' must have exactly one Log Analytics workspace destination (and no other).
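(For reference, this error message suggests a WorkspaceTransforms DCR can carry nothing but a single Log Analytics destination; a sketch of the constraint, with the same placeholder names as above:

resource "azurerm_monitor_data_collection_rule" "logs" {
  # ...
  kind = "WorkspaceTransforms"

  destinations {
    # exactly one log_analytics block, and no azure_monitor_metrics,
    # event_hub, or other destination blocks alongside it
    log_analytics {
      workspace_resource_id = module.LOG-ANALYTICS-WORKSPACE.id
      name                  = "Log-filter"
    }
  }
}
)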
Hi, I have the same problem. If you find a solution, please ping me.
Hi, can you help me?
Hi, how can I help you?
How were you able to resolve it? Can you share?
I wrote that I have the same issue and cannot resolve it either :(
Hi, I found a solution for how to resolve it :)
dataFlows = [
  {
    streams = [
      "Microsoft-ContainerLog"
    ],
    destinations = [
      "log-filter" # here must be the workspace ID, but WITHOUT hyphens!
    ],
    transformKql = "source\n| where LogEntry startswith \"{\\\"message\\\"\" or LogEntry startswith \"{\\\"level\\\"\" and Computer !startswith \"aks-monitor\"\n\n"
    #outputStream = "Microsoft-Table-ContainerLog"
  }
]
destinations = {
  logAnalytics = [
    {
      name                = "log-filter" # here must be the workspace ID, but WITHOUT hyphens!
      workspaceResourceId = module.LOG-ANALYTICS-WORKSPACE.id
    }
  ]
}

I figured this out by creating the DCR manually and checking the exported template. You're welcome :)
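(If the workspace GUID is available in the configuration, the hyphen-stripped name could be derived with Terraform's replace() function; a sketch, assuming the module exposes the workspace customer ID as a workspace_id output, which is an assumption about this particular module:

locals {
  # e.g. "12345678-1234-1234-1234-123456789012" -> "12345678123412341234123456789012"
  dcr_destination_name = replace(module.LOG-ANALYTICS-WORKSPACE.workspace_id, "-", "")
}

local.dcr_destination_name could then be used both as the logAnalytics destination name and in the dataFlows destinations list.)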
Hi,
In the past I was able to create a DCR and assign it to the desired tables.
I used this code:

resource "azurerm_monitor_data_collection_endpoint" "data_collection_endpoint" {
  name                          = "dce-data-platform-${var.env_code}-networking"
  resource_group_name           = module.svc_rg_main.rg_name
  location                      = module.svc_rg_main.rg_location
  public_network_access_enabled = true
  description                   = "Data Collection Endpoint for ${var.env_code} environment."
  tags                          = var.tags_data_collection_endpoint
}

variable "custom_table_names" {
  default = [
    "Custom-DVBDatavaultLoadLog_CL",
    "Custom-DVBJobLoadLog_CL"
  ]
}

resource "azurerm_log_analytics_saved_search" "custom_tables" {
  count                      = length(var.custom_table_names)
  name                       = var.custom_table_names[count.index]
  log_analytics_workspace_id = var.workspace_resource_ids[var.env_code]
  display_name               = var.custom_table_names[count.index]
  category                   = "CustomLogs"
  query                      = "TableName_CL | project *"
}

resource "azurerm_monitor_data_collection_rule" "data_collection_rule" {
  name                        = "dcr-data-platform-${var.env_code}-log"
  resource_group_name         = "rg-dataplatform-${var.env_code}-log"
  location                    = module.svc_rg_main.rg_location
  data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.data_collection_endpoint.id
  description                 = "Data Collection Rule for ${var.env_code} environment."
  tags                        = var.tags_data_collection_rule

  destinations {
    log_analytics {
      workspace_resource_id = var.workspace_resource_ids[var.env_code]
      name                  = var.workspace_ids[var.env_code]
    }
  }

  data_flow {
    streams       = ["Custom-DVBDatavaultLoadLog_CL"]
    destinations  = [var.workspace_ids[var.env_code]]
    output_stream = "Custom-DVBDatavaultLoadLog_CL"
    transform_kql = "source\n| extend TimeGenerated = load_entry_time\n"
  }

  data_flow {
    streams       = ["Custom-DVBJobLoadLog_CL"]
    destinations  = [var.workspace_ids[var.env_code]]
    output_stream = "Custom-DVBJobLoadLog_CL"
    transform_kql = "source\n| extend TimeGenerated = load_entry_time\n"
  }

  stream_declaration {
    stream_name = "Custom-DVBDatavaultLoadLog_CL"
    column {
      name = "load_entry_id"
      type = "int"
    }
    column {
      name = "load_entry_time"
      type = "datetime"
    }
    column {
      name = "object_id"
      type = "string"
    }
  }

  stream_declaration {
    stream_name = "Custom-DVBJobLoadLog_CL"
    column {
      name = "load_entry_id"
      type = "int"
    }
    column {
      name = "load_entry_time"
      type = "datetime"
    }
    column {
      name = "job_id"
      type = "string"
    }
  }
}
Now I'm trying to create a new DCR for another table, using exactly the same code and just changing the name of the DCR (the tables are already created directly in the portal), but I'm getting a 400 error:
azurerm_monitor_data_collection_rule.data_collection_rule: Modifying... [id=/subscriptions/xxxxx-xxxxxxxxxx-xxxxxxxxx-xxxxx/resourceGroups/rg-dataplatform-d-log/providers/Microsoft.Insights/dataCollectionRules/dcr-data-platform-d-log]

Error: updating Data Collection Rule (Subscription: "xxxxx-xxxf-xxx9-8442-1xxxxxxxxx"
Resource Group Name: "rg-dataplatform-d-log"
Data Collection Rule Name: "dcr-data-platform-d-log"): unexpected status 400 (400 Bad Request) with error: InvalidPayload: Data collection rule is invalid

  with azurerm_monitor_data_collection_rule.data_collection_rule,
  on data_collection_endpoint.tf line 43, in resource "azurerm_monitor_data_collection_rule" "data_collection_rule":
  43: resource "azurerm_monitor_data_collection_rule" "data_collection_rule" {

  updating Data Collection Rule (Subscription:
  "xxxxxxxxx-28df-4499-8442-xxxxxxxxxxxxxx"
  Resource Group Name: "rg-dataplatform-d-log"
  Data Collection Rule Name: "dcr-data-platform-d-log"): unexpected status
  400 (400 Bad Request) with error: InvalidPayload: Data collection rule is
  invalid
I'm using this:
resource "azurerm_monitor_data_collection_rule" "data_collection_rule_2" { name = "dcr-data-platform-${var.env_code}-log-2" resource_group_name = "rg-dataplatform-${var.env_code}-log" location = module.svc_rg_main.rg_location data_collection_endpoint_id = azurerm_monitor_data_collection_endpoint.data_collection_endpoint.id description = "Data Collection Rule 2 for ${var.env_code} environment." tags = var.tags_data_collection_rule
destinations { log_analytics { workspace_resource_id = var.workspace_resource_ids[var.env_code] name = var.workspace_ids[var.env_code] } }
data_flow { streams = ["Custom-Synapse_Logs_Expanded_CL"] destinations = [var.workspace_ids[var.env_code]] output_stream = "Custom-Synapse_Logs_Expanded_CL" transform_kql = "source\n| extend TimeGenerated = STARTTIME\n" } stream_declaration { stream_name = "Custom-Synapse_Logs_Expanded_CL" column { name = "TimeGenerated" type = "datetime" } column { name = "ACTIVITYITERATIONCOUNT" type = "int" } column { name = "ACTIVITYNAME" type = "string" } column { name = "ACTIVITYRETRYCOUNT" type = "int" }
What am I doing wrong here?
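(One thing that stands out, though this is only a guess: the transform reads a STARTTIME column, but no STARTTIME column is declared in the visible part of the stream_declaration. If the declared stream does not contain every column the transform references, an InvalidPayload rejection like the one above would be plausible; a hypothetical fix would be to declare it:

stream_declaration {
  stream_name = "Custom-Synapse_Logs_Expanded_CL"
  # hypothetical: the column the transform_kql reads must exist in the stream
  column {
    name = "STARTTIME"
    type = "datetime"
  }
  # ... the remaining columns as above
}
)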
Is there an existing issue for this?
Community Note
Terraform Version
1.8.0
AzureRM Provider Version
3.98.0
Affected Resource(s)/Data Source(s)
azurerm_monitor_data_collection_rule
Terraform Configuration Files
Debug Output/Panic Output
Expected Behaviour
No response
Actual Behaviour
When going to the Azure portal, under the Log Analytics workspace, and looking at the ContainerLog table under Tables, we can see that the table did not get linked to the DCR that we created.
When I do the same steps manually in the portal, it works.
JSON from the Azure portal:
Steps to Reproduce
terraform apply
Important Factoids
No response
References
No response