ChristopheBougere opened this issue 7 years ago
@ChristopheBougere Thanks for the input 👍 .
That's an interesting approach. Making the copy & prune run in parallel might speed things up. It would be great if you could submit a PR. We could then compare the `--verbose` output and see the improvements directly.
Hey guys, how are things going over there?
Hi @guilhermedecampo , thanks for asking.
The experimental approach in the closed PR was not stable enough, nor did it improve the situation significantly. Additionally, now with full packager support for yarn and npm (and probably others in the future), the situation has changed a bit. E.g. if you use yarn, it is already much faster on slower hardware.
But any ideas are welcome.
Thank you for your answer! Yes I'm using yarn.
For yarn you just have to set the packager to `yarn` under `custom.webpack` in your serverless.yml, and it should give you all the benefits of yarn and use it for packaging.
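In serverless.yml that looks roughly like this (a minimal sketch of the plugin's `packager` option):

```yaml
# serverless.yml (excerpt)
custom:
  webpack:
    packager: 'yarn'   # let serverless-webpack use yarn to install the packed external modules
```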
@HyperBrain I think I have exactly this problem at the moment. My build time is not that significant, but it seems that "Packing external modules" takes most of it. I was thinking that instead of parallelizing the installation it might be possible to use caching? My dependencies rarely change, but it seems that serverless-webpack does not utilize any caching. Also, perhaps `npm ci` could be used? Because AFAIK `npm install` does not guarantee reproducibility.
Hi @OrKoN, `npm install` actually does guarantee reproducibility - of course, only if you commit your `package-lock.json`. The general problem is that npm's caching is a lot worse than yarn's. That's why yarn is significantly faster with the plugin.
Can you try a very recent npm version, to check if they have eventually improved the caching?
If you use `--verbose`, the plugin will log the timing information of each step.
@HyperBrain Are you sure about `npm install`? I put an example repo here: https://github.com/OrKoN/npmtest. It defines lodash@2.4.1 via the lock file and the version `^2.4.0` in the package.json. If I check out the repo and run `npm install`, the lock file gets modified and `cat node_modules/lodash/package.json | jq -r .version` gives 2.4.2.
P.S. I am using npm 6.1.0
Thanks for the hint to use `--verbose`. Actually, I see that the packaging of external modules is quite fast:
Serverless: Packing external modules: uuid@3.2.1, html-entities@1.2.1, ajv@6.5.0, web-push@3.3.1, jsonwebtoken@8.2.1, axios@0.18.0, marked@0.3.19, paypal-rest-sdk@1.8.1, stripe@6.0.0, oauth@0.9.15
Serverless: Package took [3117 ms]
But then there is a loop over functions:
Serverless: Copy modules: .webpack/fnName [350 ms]
Serverless: Prune: .webpack/fnName [1616 ms]
Serverless: Run scripts: .webpack/fnName [435 ms]
where prune is the most time-consuming step, taking about 1.5 seconds for most of the functions. What does `prune` do here?
P.S. I have 29 functions
@OrKoN Prune removes the unneeded packages per function, so that a function only contains the ones it uses. This leads to minimal package sizes. E.g. imagine one of your functions uses the unicode module, which alone is about 20 MB. You do not want to have it included in every function ZIP. So, with the package optimization, only the function that uses it will actually contain it, but not the others.
This optimization is only enabled when packaging individually. If you do not need the per-function optimization, you can set `individually: false` under `package` in your serverless.yml and all functions will get the same service.zip, which is generated once. This `service.zip` will then contain everything needed for your service.
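In serverless.yml that is plain Serverless packaging configuration, for example:

```yaml
# serverless.yml (excerpt)
package:
  individually: false   # build one service.zip for all functions instead of one archive per function
```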
BTW: Having 29 functions in one service might be a sign that the service consists of multiple logical services that could be split up into smaller ones.
@HyperBrain it could be split up, but I don't want to manage multiple projects (if API Gateway is managed by Serverless, how would that work if I split the functions?). I use `package: individually: true` to minimize the size of the functions and that works fine. If I changed to `package: individually: false`, the size would increase a lot.
I guess Serverless, or perhaps serverless-webpack, could support caching. Right now I need to deploy all functions at once or know exactly which function to deploy. I think this could be figured out automatically by Serverless based on the previous deployment. Also, the result of the Copy / Prune / Run steps for every function could be cached. It packages external modules every time when, in fact, this only needs to happen when the dependencies of a function change.
I guess this is a ton of work to support something like this. I will consider splitting the project... But I have already removed custom resources from Serverless because it was hard to manage. It looks like now I will have to move the management of API Gateway somewhere else. And not much would actually remain...
Hi guys, any updates on this? It seems like this issue was completely abandoned.
I experimented with a bunch of different loaders. In the end I found that it works best with esbuild (via esbuild-loader).
Your deploy time will still take a while since it goes through CloudFormation, but the build/packaging step should be much faster. Hopefully this gets you down to 3-5 minutes.
Some other tips:

- Use `yarn` for packaging
- Exclude `aws-sdk`, as it's available in the Lambda runtime (see the serverless.yml sketch after the config)

Here is the webpack config I ended up with:

```js
const path = require('path')
const slsw = require('serverless-webpack')
const nodeExternals = require('webpack-node-externals')

module.exports = {
  mode: 'production',
  entry: slsw.lib.entries,
  devtool: 'source-map',
  externals: [nodeExternals()],
  target: 'node',
  resolve: {
    extensions: ['.ts'],
  },
  module: {
    rules: [
      {
        test: /\.ts$/,
        loader: 'esbuild-loader',
        options: {
          loader: 'ts',
          target: 'es2020', // node 14, use es2019 if node 12 etc...
        },
      },
    ],
  },
}
```
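For the aws-sdk exclusion on the packaging side, something along these lines in serverless.yml should work; this is just a sketch using the plugin's `includeModules.forceExclude` option, so double-check it against the serverless-webpack docs:

```yaml
# serverless.yml (excerpt) - assumes serverless-webpack packages externals via includeModules
custom:
  webpack:
    includeModules:
      forceExclude:
        - aws-sdk   # already provided by the Lambda runtime, no need to package it
```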
Waking up this issue again... I've done what the previous comment suggested, but unfortunately esbuild-loader doesn't really do anything to address the slowness of copying external modules sequentially.
My builds take over 10 minutes, almost entirely during the packaging external modules step... Using serverless-esbuild takes significantly less time (around 2-3 mins), but unfortunately due to this bug it misses a lot of externals. But it means there must be a way to speed things up for webpack...
This is a Feature Proposal
Description
I'm using serverless-webpack with about a dozen lambdas. When I run `serverless deploy`, it takes about 10 minutes to deploy my service, and most of that is actually the "Packing external modules" step. I just added the `--verbose` option and it seems that every function is packaged one after another. Is there a reason not to run them in parallel? I think this could be changed here: https://github.com/serverless-heaven/serverless-webpack/blob/master/lib/packExternalModules.js#L213
I could do a PR if necessary; this could really save us a lot of time.
Additional Data