JoshVarty closed this issue 9 years ago
Wow, that took a while! Since processing is serial, we could log whenever we move between consecutive steps and correlate those timestamps with the summary on Azure.
Did you guys think about using Application Insights for better data visualization of perf and usage?
No, I hadn't heard of it. Is this it? Btw, JSON.net took 50 minutes to process.
It's a service on Azure that lets you quite easily get perf and usage telemetry out of your web app.
http://azure.microsoft.com/en-us/services/application-insights/
Btw, is SourceBrowser hosted on Azure?
Yes. I'll follow these instructions; do you think that's ok? http://azure.microsoft.com/en-us/documentation/articles/app-insights-monitor-performance-live-website-now/
This will give you basic data like request rate, response timing, etc. Should be useful to start with.
I was thinking more about custom duration events. They require custom code, but give you more insight into how long specific parts of the codebase took to finish. They are described here under Log timed events. It would be great to see how long it takes to get code from GitHub, parse it, analyze it, generate HTML, etc. on upload, and how long it takes to get search results.
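The timed-event idea above can be sketched roughly like this. This is illustrative Python pseudocode, not the actual SourceBrowser pipeline: `TimedEventLog` is a hypothetical stand-in for the Application Insights client, and the stage functions are made up.

```python
import time

class TimedEventLog:
    """Hypothetical stand-in for a telemetry client; a real
    Application Insights call would go where log_event() is."""
    def __init__(self):
        self.events = []  # (step name, duration in seconds) pairs

    def log_event(self, name, duration):
        self.events.append((name, duration))

def timed(log, name, func, *args):
    """Run one pipeline step, then log how long it took."""
    start = time.perf_counter()
    result = func(*args)
    log.log_event(name, time.perf_counter() - start)
    return result

# Example upload pipeline with made-up stage functions.
log = TimedEventLog()
source = timed(log, "FetchFromGitHub", lambda: "fake source")
tokens = timed(log, "Parse", lambda s: s.split(), source)
html = timed(log, "GenerateHtml",
             lambda t: "<html>%d tokens</html>" % len(t), tokens)
```

Each stage then shows up as a named event with its duration, which is exactly the breakdown (fetch, parse, analyze, generate) you'd want on the dashboard.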
We'll try to set it up and write code to interface with the insights. Thanks for the link! It looks promising.
If you set up the configs, I'm happy to help add data points to the code later today.
It would also be nice to have some kind of utils class you could just wrap stuff in with a `using`
statement, and it would log perf for that operation at the end. I'll do some research for libraries like this, or write my own if I find nothing.
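In C# that wrapper would be an `IDisposable` timer disposed at the end of the `using` block; here is a rough Python equivalent of the same pattern using a context manager. The names and the `durations` dictionary standing in for the telemetry sink are illustrative.

```python
import time
from contextlib import contextmanager

# Hypothetical sink; a real implementation would send a timed
# event to telemetry instead of storing it in a dict.
durations = {}

@contextmanager
def log_perf(name):
    """Wrap an operation; record its duration when the block exits,
    even if the wrapped code raises."""
    start = time.perf_counter()
    try:
        yield
    finally:
        durations[name] = time.perf_counter() - start

with log_perf("BuildIndex"):
    total = sum(range(1000))  # stand-in for the real work
```

The nice property is that timing is recorded in the `finally` path, so failed operations still get logged with a duration.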
The insights are set up. Let me know what I need to do on the admin side. Also, can we do generic logging using insights?
Yes, you can use events to do so: http://msdn.microsoft.com/en-us/library/dn481100.aspx Adding custom properties to events will let you filter them later in the dashboard by their values.
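The custom-properties idea looks roughly like this. The event store and `track_event` helper here are hypothetical Python stand-ins; the real Application Insights `TrackEvent` call takes an event name plus a string-to-string property dictionary in much the same shape.

```python
# Hypothetical in-memory event store for illustration only.
events = []

def track_event(name, properties):
    """Record a named event with filterable custom properties."""
    events.append({"name": name, "properties": properties})

# Made-up example events for an upload scenario.
track_event("Upload", {"repo": "Newtonsoft.Json", "outcome": "success"})
track_event("Upload", {"repo": "ExampleRepo", "outcome": "failure"})

# Filtering by a property value, the way the dashboard would:
failures = [e for e in events
            if e["properties"].get("outcome") == "failure"]
```

Filtering on property values like this is what makes events usable as generic logs, as long as volume stays within whatever limits the service imposes.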
The question is how many logs you want to store and how you are going to use them. I don't know (and can't find any resources on) what the limits are on the number of events you can log with Application Insights.
For a lot of data it might be better to use Azure Tables to store logs.
I don't think there is anything you have to do on the admin side. If the config is set correctly, we should be able to just use the NuGet package to log events. I'll look at it.
Newtonsoft.Json is now down to ~1:55. Obviously not ideal to make users wait for that long, but it's a lot more reasonable.
So I was checking that #48 was fixed and I noticed that it took an extremely long time to upload (~15 minutes).
We should investigate why this was the case and if we can mitigate it. Maybe we can scale our server up a level.
Here's some summary info on the upload. It looks like CPU usage might be the issue here. We should profile locally and find the bottlenecks. (I suspect it's building the Lucene index.)