mikhailshilkov / mikhailio-hugo

Sources of https://mikhail.io migrated to Hugo
MIT License

Comments to "Comparison of Cold Starts across AWS, Azure, and GCP" #1

Open mikhailshilkov opened 5 years ago

mikhailshilkov commented 5 years ago

Add your comment to Comparison of Cold Starts in Serverless Functions across AWS, Azure, and GCP. The comments will be displayed directly on the page.

mesgme commented 5 years ago

I'm especially surprised that C# cold starts take less time on AWS than on Azure. Are they both running on top of Linux?

Any idea what the reason for this difference is?

mikhailshilkov commented 5 years ago

Both C# runtimes use .NET Core, but AWS runs it on Linux while Azure runs on Windows.

AWS and Azure use completely different infrastructure for their serverless functions, so the difference comes from there, not from .NET per se.

whitetigle commented 5 years ago

I think it would be interesting to add OpenWhisk/IBM Functions to this comparison.

RichiCoder1 commented 5 years ago

I'm curious about the C# performance here too. In most analyses I've seen, C# usually leads the pack on AWS. Edit: never mind, this is about cold starts.

mpoisot commented 5 years ago

Perhaps you could add Cloudflare Workers to your comparison? They claim to have a superior approach compared to AWS et al. by running everything within a custom V8 process. Their FAQ makes some outrageous claims that are begging to be real-world tested.

Q: Is there a cold-start time for my Worker? A: ..."in the general case a Worker can be started in less than five milliseconds."

If that's true then their cold start time is 100-1000x better than the competition. Sounds way too good to be true! It would be great to hear your experienced opinion on their approach, price, resource limits, etc. Is it just a neat toy? Only useful for specific scenarios, not general "serverless" computing? Will the big 3 cloud providers end up implementing a similar approach?
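
For context, a Worker is a single script deployed into a V8 isolate rather than a container or VM, which is what the sub-5 ms claim rests on. A minimal, purely illustrative example (module syntax, not taken from the article or from any benchmark):

```ts
// A minimal Cloudflare Worker in TypeScript (module syntax), purely illustrative.
// The whole deployable unit is this one script running inside a V8 isolate,
// so there is no container or VM to boot on a cold start.
export default {
  async fetch(request: Request): Promise<Response> {
    const url = new URL(request.url);
    return new Response(`hello from ${url.pathname}`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```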

liuyiVector commented 4 years ago

How do I get the cold start time of a Google Cloud Function?
I only see "Function execution took xx ms" in the log. Is that the cold start time plus the execution time? Thanks!

mikhailshilkov commented 4 years ago

My take is that I don't trust the providers' own reports of cold start duration. I measure end-to-end latency myself and then try to derive the cold-start portion of it empirically. That's definitely not perfect, but at least I have no systematic bias coming from each vendor. In other words, I don't know the answer to your specific question 😃
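
For illustration only, the idea can be sketched in a few lines of TypeScript. This is a hypothetical sketch, not the actual cloudbench code; the endpoint URL, the sample count, and the assumption that the first request after a long idle period hits a cold instance are all placeholders:

```ts
// Hypothetical sketch of deriving cold-start overhead from end-to-end latency.
// FUNCTION_URL is a placeholder for an HTTP-triggered function endpoint.
const FUNCTION_URL = "https://example.com/api/ping";

async function measureOnce(): Promise<number> {
  const start = Date.now();
  await fetch(FUNCTION_URL); // latency as observed by the caller, not the provider's log
  return Date.now() - start;
}

async function main(): Promise<void> {
  // Assume the function has been idle long enough that the first call is cold.
  const cold = await measureOnce();

  // Follow-up calls should hit the already-warm instance.
  const warm: number[] = [];
  for (let i = 0; i < 5; i++) {
    warm.push(await measureOnce());
  }
  warm.sort((a, b) => a - b);
  const warmMedian = warm[Math.floor(warm.length / 2)];

  console.log(`presumed cold call: ${cold} ms, warm median: ${warmMedian} ms`);
  console.log(`derived cold-start overhead: ~${cold - warmMedian} ms`);
}

main();
```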

angelcervera commented 4 years ago

I was thinking about writing this post. Thanks for doing it for me!! :smile: I'm moving from AWS projects to Azure, and cold starts in the latter are a really serious problem.

ernani commented 3 years ago

Nice article. Did you publish the code you used to obtain these results? That would certainly help others do their own due diligence. I'm also wondering about the results of the Functions v3 improvements. Thanks!

mikhailshilkov commented 3 years ago

All the code is in https://github.com/mikhailshilkov/cloudbench/

wondering about the results of the Functions v3 improvements

I presume they run on the same infrastructure as v2 but that's a good point - I'll do that during my next run.

Manni79 commented 2 years ago

I presume they run on the same infrastructure as v2 but that's a good point - I'll do that during my next run.

Would be great to see a 2022 update with V3. Hope Azure has improved. Thanks a lot for your awesome work!

mithunshanbhag commented 2 years ago

I'd love to see an update to this article, especially since scale controller logs can now be enabled in Azure Functions.