AmericanRedCross / damage-assessment-bot

Chatbot for immediate damage and needs assessments
BSD 3-Clause "New" or "Revised" License

Azure Functions Cold Start Analysis #11

Closed m4thfr34k closed 6 years ago

m4thfr34k commented 6 years ago

Update: Skipping the second criterion for now. I think we've proven what we need to about cold starts, and although we don't have recorded timings for a function hosting a chatbot, we are regularly using such a bot and have not seen significantly longer startup times. We should revisit performance analysis further down the road, once the application is more fully featured.

maxnorth commented 6 years ago

@blueelvis Created a few function apps in different regions, waited for a half hour or so and then tried cold starts.

Here are the times (sent from NC):

| Region | 1st | 2nd |
| --- | --- | --- |
| East US 2 | 2.094s | 3.225s |
| East Asia | 3.601s | 3.331s |
| South India | 2.897s | 3.654s |
| Pranav's App | 22.736s | 22.727s |
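The first-vs-second-call pattern in the table can be illustrated with a rough local simulation; the snippet below is only a sketch, where a busy loop stands in for one-time startup work such as module loading (it is not the project's actual function or measurement script):

```javascript
// Sketch: compare first ("cold") vs. second ("warm") invocation time locally.
// The busy loop simulates one-time startup cost; real cold starts also include
// VM provisioning and any package installs.
function makeHandler() {
  let initialized = false;
  return function handler() {
    if (!initialized) {
      for (let i = 0; i < 5e6; i++) {} // simulated startup cost
      initialized = true;
    }
    return "hello";
  };
}

function timeCall(fn) {
  const start = process.hrtime.bigint();
  fn();
  return Number(process.hrtime.bigint() - start) / 1e6; // elapsed ms
}

const handler = makeHandler();
const timings = [timeCall(handler), timeCall(handler)];
console.log(`1st: ${timings[0].toFixed(3)} ms, 2nd: ${timings[1].toFixed(3)} ms`);
```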

I created mine using the Azure portal's function app UI and verified in the App Service Editor that the file system contained nothing but the index.js and function.json files. How did you create yours? Is it possible that a package.json file is present? If so, it will run an `npm install` when the VM starts, which was the most common reason I found online for cold starts in the ~30s range.

maxnorth commented 6 years ago

Single-JS-file 'hello world' app timings with warmed-up instances in different regions (sent from NC):

[Screenshots: warm-start timings for East US 2, East Asia, and South India]

blueelvis commented 6 years ago

@maxnorth - Thanks for the pointer. There were a couple of other functions on a different runtime but on the same instance. I removed those functions and ran another cold-start check, but still saw somewhere around ~26 seconds. Your finding is correct.

I found a package.json in Kudu; during scale-up, the platform was running `nuget restore` against it, which caused the delay.

The conclusion for me is that if there are other functions hosted on the same instance (even with different runtimes), the runtime loads all of them on cold start, not just the one being invoked.

This translates roughly to the following scenario -

  1. There is a simple function without any npm modules and another function with few modules.
  2. The cold start problem will affect both of them when a new instance comes online because all functions are loaded and so are their modules.

So, if two functions depend on different modules, cold start takes quite some time because the packages for both must be installed. Even if each function has just a single dependency, and assuming we need at least a dozen functions, we are talking about pulling and installing 12 modules on every cold start of an instance. We cannot deploy each function as a separate function app (even on the Consumption plan), as maintenance becomes painful.

Edit - I forgot about this, but if we are bundling with something like Webpack/Funcpack, I don't think we would run into this issue, since dependencies would be pre-bundled and no install step would run. What do you think? Of course we need to do some more testing.


maxnorth commented 6 years ago

@blueelvis Yep, we already have webpack incorporated into the build, so we won't run into such long times, but we should keep an eye on performance as the application grows and more functions are added, each with their own bundled dependencies. If, down the road, the application is getting too large with each function bundling its own dependencies, we can look at configuring webpack to generate a shared module with common dependencies between functions.
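A setup along those lines could look roughly like the config below; this is only a sketch, not the repo's actual build config, and the entry names and paths are hypothetical:

```javascript
// webpack.config.js — sketch only; entry names/paths are illustrative,
// not this repo's actual function layout.
const path = require("path");

module.exports = {
  target: "node",   // keep Node built-ins (fs, http, ...) out of the bundle
  mode: "production",
  entry: {
    // one bundle per function, so each cold start reads a single file
    // instead of running a package install
    assessment: "./assessment/index.js",
    status: "./status/index.js"
  },
  output: {
    path: path.resolve(__dirname, "dist"),
    filename: "[name]/index.js",
    libraryTarget: "commonjs2" // export shape the Functions runtime expects
  }
  // If total size becomes a problem, optimization.splitChunks can factor
  // dependencies shared across functions into a common module.
};
```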

m4thfr34k commented 6 years ago

@maxnorth can you update this issue with the current status?