MicrosoftDocs / azure-docs

Open source documentation of Microsoft Azure
https://docs.microsoft.com/azure

Sizing and limits #84681

Closed andrePKI closed 2 years ago

andrePKI commented 2 years ago

What are the limitations of this solution? I would like to know what its limits are, such as the maximum number of records returned in a query, the maximum number of invocations per day, the maximum throughput of the blob storage connector, and so on.

We (enterprise scale, 15k employees) are looking at ingestion of about 1.3 billion events/day, concentrated during daytime. That would be about 30,000 events per second for 12 hours. I don't even want to think about the data volume (say 1 CEF event is about 1.5 kB... 1860 TB???). We are never getting that into any Logic App configuration that we could afford... Or are my figures nonsense?



MonikaReddy-MSFT commented 2 years ago

Thanks for bringing this to our attention. We will investigate it further and update you shortly.

MonikaReddy-MSFT commented 2 years ago

@andrePKI - The service limits article should help: Azure Monitor service limits. The ingestion article might also be relevant: Log data ingestion time in Azure Monitor.

Please take a look and let me know if that helps.

MonikaReddy-MSFT commented 2 years ago

Hope the docs were helpful.

At this moment, since there is no doc update required, we will now proceed to close this thread. If there are further questions regarding this matter, please tag me in your reply; we will gladly continue the discussion and reopen the issue.

andrePKI commented 2 years ago

@MonikaReddy-MSFT Hi Monika, yes, thank you, the docs were helpful. Although it would be nice to have some figures about the limits of this particular solution (some of which are already present on this page), maybe this is not the place for it. Anyway, we find that we can handle our requirements with this solution (not to mention I was a factor of 1000 off: 1.8 TB instead of 1860 TB ;-)).
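
For reference, a quick back-of-the-envelope check of the figures from the original question (a minimal sketch, assuming roughly 1.3 billion events/day concentrated in a 12-hour window and about 1.5 kB per CEF event); it lands at roughly 30,000 events per second and about 2 TB per day, i.e. three orders of magnitude below the original 1860 TB estimate:

```python
# Back-of-the-envelope check of the ingestion figures discussed above.
# The inputs are the assumptions from this thread, not measured values.

EVENTS_PER_DAY = 1.3e9        # ~1.3 billion events/day (assumption from the question)
INGEST_WINDOW_HOURS = 12      # events concentrated during daytime
AVG_EVENT_SIZE_BYTES = 1.5e3  # ~1.5 kB per CEF event (assumption from the question)

events_per_second = EVENTS_PER_DAY / (INGEST_WINDOW_HOURS * 3600)
daily_volume_tb = EVENTS_PER_DAY * AVG_EVENT_SIZE_BYTES / 1e12  # decimal terabytes

print(f"~{events_per_second:,.0f} events/s")  # ~30,093 events/s
print(f"~{daily_volume_tb:.2f} TB/day")       # ~1.95 TB/day, not 1860 TB
```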