m4thfr34k closed this issue 6 years ago
@blueelvis Created a few function apps in different regions, waited for a half hour or so and then tried cold starts.
Here are the times (sent from NC):
Region | 1st | 2nd |
---|---|---|
East US 2 | 2.094s | 3.225s |
East Asia | 3.601s | 3.331s |
South India | 2.897s | 3.654s |
Pranav's App | 22.736s | 22.727s |
I created mine using the azure portal's function app UI and verified in the app service editor that the file system contained nothing but the index.js and function.json files. How did you create yours? Is it possible that a package.json file is present? If so, it will run an npm install when the VM starts, which was the most common reason I found online for cold starts in the ~30s range.
Single js file 'hello world' app timings with warmed up boxes for different regions (sent from NC):
East US 2
East Asia
South India
@maxnorth - Thanks for the pointer. There were a couple of other functions in a different runtime but on the same instance. I removed those functions and did another cold-start check, but it still took around 26 seconds. Your finding is correct: I found a `package.json` in Kudu, and during scale-up it was running `nuget restore` on it, which caused this much delay.
The conclusion for me is that if other functions are hosted in the same instance (even ones with different runtimes), the host loads all of them, not just the one being invoked.
This translates roughly to the following scenario:
If two functions depend on different modules, cold start takes quite some time because both sets of packages must be restored. Even if each function has only a single dependency, and assuming we need at least a dozen functions, we are talking about pulling and installing 12 modules on every cold start of the instance. We cannot deploy each function as a separate function app (even on the Consumption plan) because maintenance becomes painful.
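To make the scenario concrete: even a minimal `package.json` like this (name and dependency are hypothetical, for illustration only) is enough to trigger a package restore on each instance cold start:

```json
{
  "name": "my-function",
  "version": "1.0.0",
  "dependencies": {
    "lodash": "^4.17.21"
  }
}
```

Multiply that restore step across a dozen functions with distinct dependencies and the ~26-second cold starts above follow.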
Edit - I forgot about this, but if we are running something like Webpack/Funcpack, I don't think we would run into this issue. Of course we need to do some more testing. What do you think?
@blueelvis Yep, we already have webpack incorporated into the build, so we won't run into such long times. We should keep an eye on performance, though, as the application grows and more functions are added, each bundling its own dependencies. If down the road the application gets too large with each function bundling its own dependencies, we can look at configuring webpack to generate a shared module with the dependencies common to the functions.
@maxnorth can you update this issue with the current status?
- [ ] Provide timings for cold start/normal responses with chat bot packages (bundled)

**Update:** Skipping the second criterion for now. I think we've proven what we need to about cold starts, and although we don't have recorded timings for the function with a chatbot, we are regularly using such a bot and have not seen significantly longer startup times. We should revisit performance analysis further down the road, once the application is more fully featured.