yanniand opened this issue 6 months ago
I found useful information here:

1. Community posts: https://community.grafana.com/t/configure-promtail-loki-to-collect-logs-files-from-cold-storage-azure-storage-account/115009/4 and https://community.grafana.com/t/collect-log-files-from-kubernets-pods/111478/7
2. Stack Overflow question: https://stackoverflow.com/questions/78075726/ingesting-logs-from-azure-blob-storage-to-loki/78152668#78152668
All the audit logs from a running cluster are stored in "insights-logs-*" containers in an Azure storage account; this was already configured under the "Diagnostic settings" section of the Azure Kubernetes Service cluster.
I would like to introduce Loki for cost-effective logging, and I have managed to configure storage in the loki-distributed Helm setup so that logs are stored in a storage-account back end, after granting the node pool the "Storage Blob Data Contributor" role.
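For context, the Azure back-end portion of my loki-distributed values looks roughly like this (a sketch only; key names follow Loki's `azure` `storage_config` block, and the account/container names are placeholders):

```yaml
loki:
  structuredConfig:
    storage_config:
      azure:
        account_name: mystorageaccount   # placeholder
        container_name: loki-chunks      # placeholder
        # relies on the node pool's "Storage Blob Data Contributor" role
        use_managed_identity: true
```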
How can I configure Promtail in loki-distributed so that it can process the old audit logs stored by Azure diagnostic settings? Unlike other logs, these logs are not generated by a container; rather, they have been archived for several months.
I tried checking whether someone else had a similar problem reading log files from Azure storage accounts, but so far I have only found answers about storing logs in Azure.
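Since Promtail has no native Azure Blob reader, the workaround suggested in the linked Stack Overflow answer is a one-off backfill script: download the archived JSON blobs (e.g. with `azure-storage-blob`) and POST their records to Loki's push API (`/loki/api/v1/push`). A minimal sketch of the record-to-payload conversion, assuming the common diagnostic-settings schema (`time`, `category` fields) and a hypothetical `job` label:

```python
import json
from datetime import datetime, timezone

def azure_record_to_loki_stream(record: dict) -> dict:
    """Convert one Azure diagnostic-settings log record (one JSON object
    per line in the archived PT1H.json blobs) into a Loki push-API stream.
    The field names ("time", "category") follow the common diagnostic
    log schema; adjust them if your records differ."""
    # Azure timestamps look like "2024-01-15T10:00:00.0000000Z"; Loki
    # wants nanoseconds since the epoch, encoded as a string.
    ts = record["time"].rstrip("Z")
    if "." in ts:
        # fromisoformat (pre-3.11) accepts at most 6 fractional digits
        head, frac = ts.split(".")
        ts = f"{head}.{frac[:6]}"
    dt = datetime.fromisoformat(ts).replace(tzinfo=timezone.utc)
    ns = str(int(dt.timestamp()) * 1_000_000_000 + dt.microsecond * 1000)
    return {
        "stream": {
            "job": "azure-audit-backfill",  # hypothetical label
            "category": record.get("category", "unknown"),
        },
        "values": [[ns, json.dumps(record)]],
    }

def build_push_payload(records: list[dict]) -> dict:
    """Body for POST /loki/api/v1/push (Content-Type: application/json)."""
    return {"streams": [azure_record_to_loki_stream(r) for r in records]}
```

Keep in mind that by default Loki rejects samples older than `reject_old_samples_max_age`, so backfilling months-old audit logs also means relaxing that limit (or disabling `reject_old_samples`) in the limits config.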