[Open] tdkeeley opened this issue 11 months ago
Same problem on our side. We have Workbooks based on these columns.
We are also having this issue.
For now we are refactoring our Azure Monitor queries to use `parse_json`, but log exploration is still quite painful now that we can't easily use drag-and-drop grouping for these columns.
We investigated this issue and sincerely apologize for the inconvenience caused by the recent changes in log parsing functionality within Azure Container Apps Log Analytics integration.
There are a number of issues with dynamically generated columns based on JSON payloads in Log Analytics. This change addresses those issues and improves logging in general.

As a workaround, you can use the `parse_json` function, which parses JSON text and returns a record of the parsed JSON. Here's a generalized approach to parsing JSON-formatted text in a column in Log Analytics. Assume a column `Log_s` that contains JSON-formatted text like this:
{
"Log_message_s": "Example message",
"Log_level_s": "Information",
"Log_trace_id_g": "example-trace-id"
}
Here's a basic example of how you might write a Kusto Query Language (KQL) query to parse the JSON data from that column (https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/parsejsonfunction):
ContainerAppsConsoleLogs_CL
| extend ParsedJSON = parse_json(Log_s)
| project
LogMessage = ParsedJSON.Log_message_s,
LogLevel = ParsedJSON.Log_level_s,
LogTraceId = ParsedJSON.Log_trace_id_g
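For readers more comfortable with plain JSON tooling, the KQL above can be illustrated with a rough Python analogue. This is only an illustration of the parse-and-project pattern, not anything Log Analytics runs; the function name `project_log` is made up for this sketch.

```python
import json

# Rough Python analogue (illustration only) of the KQL query above:
# parse the JSON string stored in Log_s, then project three fields,
# mirroring `extend ParsedJSON = parse_json(Log_s) | project ...`.
def project_log(log_s: str) -> dict:
    parsed = json.loads(log_s)  # counterpart of parse_json(Log_s)
    return {
        "LogMessage": parsed.get("Log_message_s"),
        "LogLevel": parsed.get("Log_level_s"),
        "LogTraceId": parsed.get("Log_trace_id_g"),
    }

row = ('{"Log_message_s":"Example message",'
       '"Log_level_s":"Information",'
       '"Log_trace_id_g":"example-trace-id"}')
print(project_log(row))
```

As with the KQL version, fields missing from a given log line simply come back empty (`None` here) instead of failing the query.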
@sanchitmehta I appreciate that this change will improve performance for logging, etc. But the fact is that this is a breaking change. What about all the teams that depend on the log format you had before? Maybe they have built dashboards relying on the previous structure? Alerts? Those won't work anymore.

What you should have done is introduce an opt-in feature for the new format, not the other way around. This is not acceptable.
Just to support @razum90: this issue is causing us real pain in our production systems. Logs are no longer intuitively searchable, and our workbooks with predefined filters and links to the correlated traces in Application Insights no longer work. Instead of being productive, we are spending time on workarounds.
We, too, spent some hours investigating why our logs no longer worked – and then some more to fix our saved queries and dashboards depending on log messages. I support @razum90's appeal wholeheartedly.
First and foremost, we'd like to sincerely apologize for the recent changes in the log parsing functionality within Azure Container Apps Log Analytics integration that may have affected your experience. We recognize the importance of continued seamless integration for our users, and we regret the oversight in altering the default behavior without prior notification.
We are working on an emergency fix. Due to the nature of this issue, even rolling back to its original state will again be a breaking change for customers who have since implemented the workaround we posted. Hence, we are requesting customers who need to revert to the old behavior to reach out to us at acasupport(at)microsoft(dot)com so that we can apply a patch on your specific environments.

We are also working on a fix (after the emergency patching is done) to introduce a property on Container App Environments which makes this behavior configurable, under the name `dynamicJsonColumns`. Details of this feature will be shared once it is Generally Available. We will ensure environments which received the emergency patch are automatically opted into the right values for this configuration property.
Thanks, Azure Container Apps Team
Yesterday, in one of our subscriptions, the behavior reverted to the old one without us having requested this and without prior notice, breaking our log queries again.
@sebastian-hans-swm can you send your subscription ID and environment name to acasupport(at)microsoft(dot)com? We will analyze and find out what happened.
@JennyLawrance, I did that (on Nov 2nd, subject: "Reverted log parsing behaviour") and did not receive feedback of any kind. 😞
Is there any release date of this configuration?
Recently, Azure Container Apps Environments introduced a new property called `DynamicJsonColumns` to control the logging behavior for JSON-formatted logs in Azure Log Analytics.

When set to `false`, logs will be stored in the `Log` column as a raw string regardless of format. When set to `true`, JSON-formatted logs will automatically be split into dynamic columns in Log Analytics based on the property names of the JSON object, and the `Log` column will be empty.

For example, if `DynamicJsonColumns = true` and your log content is:
{
"timestamp":"2023-11-15T22:00:00Z",
"logger":"my.json.logger",
"content":
{
"message":"hello world message",
"level":"info"
}
}
You will see the following columns in Log Analytics:
Log_timestamp_s = 2023-11-15T22:00:00Z
Log_logger_s = my.json.logger
Log_content_message_s = hello world message
Log_content_level_s = info
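The mapping above (nested keys joined with underscores under a `Log` prefix, plus a type suffix) can be sketched in a few lines. This is an assumption-based illustration, not ACA's actual implementation; only the `_s` (string) and `_g` (GUID) suffixes appear in this thread, so the sketch handles strings only.

```python
import json

def flatten_log(json_text: str, prefix: str = "Log") -> dict:
    """Illustrative sketch (not ACA's real code) of how a JSON log line
    becomes Log Analytics columns: nested keys are joined with
    underscores under a "Log" prefix, and string values get the "_s"
    type suffix seen in this thread."""
    def walk(obj: dict, path: str) -> dict:
        columns = {}
        for key, value in obj.items():
            new_path = f"{path}_{key}"
            if isinstance(value, dict):
                # Recurse so "content.message" becomes Log_content_message_s.
                columns.update(walk(value, new_path))
            else:
                # Only the string suffix "_s" is shown in this thread;
                # other Log Analytics type suffixes are not modeled here.
                columns[f"{new_path}_s"] = value
        return columns
    return walk(json.loads(json_text), prefix)

line = ('{"timestamp":"2023-11-15T22:00:00Z","logger":"my.json.logger",'
        '"content":{"message":"hello world message","level":"info"}}')
for name, value in flatten_log(line).items():
    print(f"{name} = {value}")
```

Running this on the sample payload reproduces the four columns listed above.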
By default, `DynamicJsonColumns` is `false`. If you were one of the customers who requested the Product team to revert the JSON parsing changes prior to the introduction of the `DynamicJsonColumns` property, then your environment will have `DynamicJsonColumns` set to `true` with no further action required.
The `DynamicJsonColumns` property can be changed through an ARM request or with the Azure CLI.

To change `DynamicJsonColumns` through the Azure CLI, you must have the `containerapp` extension installed with version 0.3.44 or above. Please run `az version` to check the current version, and `az upgrade` to upgrade the CLI to the latest version if necessary.
{
....
"extensions": {
"containerapp": "0.3.44", << Make sure your containerapp extension is at version 0.3.44 or above.
}
}
To enable or disable `DynamicJsonColumns` on an existing Container App Environment, you can run the following command:
az containerapp env update --subscription <subscription> --resource-group <resource_group> --name <containerapp-env-name> --logs-dynamic-json-columns true/false
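To confirm the change took effect, you can read the value back with `az containerapp env show`. This is a sketch: it assumes a logged-in Azure CLI session and uses the property path shown in the ARM resource example in this comment.

```shell
# Read back the current setting (sketch; requires an authenticated
# Azure CLI session, so output depends on your environment).
az containerapp env show \
  --subscription <subscription> \
  --resource-group <resource_group> \
  --name <containerapp-env-name> \
  --query "properties.appLogsConfiguration.logAnalyticsConfiguration.dynamicJsonColumns"
```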
To change `DynamicJsonColumns` using ARM, simply set the property on the Container App Environment resource as shown here:
{
"id": "<resource id>",
"name": "<name>",
"type": "Microsoft.App/managedEnvironments",
"location": "West US",
"properties": {
"appLogsConfiguration": {
"destination": "log-analytics",
"logAnalyticsConfiguration": {
"customerId": "<log-analytics-customer-id>",
"sharedKey": "<log-analytics-key>",
"dynamicJsonColumns": true
}
}
}
}
`DynamicJsonColumns` currently cannot be changed from the Azure Portal; this capability is coming soon.

`DynamicJsonColumns` is only supported for Azure Log Analytics. For Azure Monitor, JSON-formatted logs will be stored as a raw string in the `Log` column.

`DynamicJsonColumns` is not supported today for the Consumption workload in the Consumption + Dedicated environment.

We are continuously taking steps to improve the service and our processes to ensure such incidents do not occur in the future.
We apologize for any inconvenience.
Regards,
The Microsoft Azure Team
I like writing queries using dot path to access my log data (therefore using parse_json), however:
The function parse_json triggers full parsing of this text which might consume extensive compute resources when original string is large and volume of records is high.
Is there a way NOT to use the DynamicJsonColumns option but still have my Log_s parsed by default as a dynamic column (like Log_json or something)? Or is it exactly the same as adding parse_json in every query?
Hi @random42, there is no way to achieve what you want. You should use either:
@howang-ms
The DynamicJsonColumns for the Consumption workload in the Consumption + Dedicated environment doesn't support today.
I’m not sure what you mean by that. Can you elaborate?
I've tested `dynamicJsonColumns: true` for all workload profile combinations, but none of them produces structured logs given a sample log line like this:
{"hostname":"MacBook-Pro-M2.lan","level":"info","msg":"Listening on :8000","time":"2023-12-28T15:01:41+01:00"}
- In one scenario, the `Log_s` column remains empty, so now I'm losing log messages.
- On the Consumption profile in the Consumption + Dedicated tier, `Log_s` contains the JSON log output written by the Container App, so nothing has changed.
- In the remaining scenario, the `Log_s` column also remains empty.

Am I missing something?
@joergjo, what I mean is:
Consumption profile on Consumption + Dedicated tier: Log_s contains the JSON log output written by Container App, so nothing has changed.
This is expected today, and we are working on the support now; it should be deployed very soon.

But the other two scenarios are not expected. You should see new columns like Log_hostname_s in your Log Analytics workspace. Can you double-check by querying the Log_hostname_s column directly? Sometimes Log Analytics will hide a column from the default view if a lot of logs don't have it. If it still doesn't work, please send your Container App environment FQDN to acasupport at microsoft dot com so we can take a closer look.
@howang-ms
I know what you mean, but unfortunately no. See below.
I'll ping the support DL.
Big thanks to the ACA team for getting to the bottom of this. The keyword here is patience - it does take some time before the columns show up 😁.
Hi @howang-ms, is there an update as to whether the Consumption profile on Consumption + Dedicated tier is supported/when this will be supported?
In my project I use a structured logger, but when I output logs in JSON format I run into the maximum number of columns and no logs are recorded, so I had no choice but to output logs in text format, which is unfortunate. Now I found this issue. I hope this method will resolve it; I will try it. Thanks, team.
Issue description
If the payload being logged was in JSON format, ContainerAppConsoleLogs_CL would parse it into dynamic columns. Log_s always included the JSON string, but if your payload was:
I would see columns for:
Now I don't see any of that anymore. This is a typical payload; my JSON objects don't have more than 6 attributes.
I'm wondering:
I have a number of Azure Monitor queries and alerts set up that depend on columns that are no longer being populated.