daniel-cottone opened this issue 6 years ago
Hi @daniel-cottone, thanks for reporting. This could be an issue with your configuration. serverless-webpack since 3.0.0 requires that you use slsw.lib.entries for your entry definitions and have the function handlers declared correctly in your serverless.yml if you use individual packaging.
Can you post the function definitions from your serverless.yml and the webpack config file?
Hey @HyperBrain, thanks for the quick response. Here's the webpack configuration:
var path = require('path');
var slsw = require('serverless-webpack');
var Webpack = require('webpack');
module.exports = {
entry: slsw.lib.entries,
resolve: {
extensions: ['.ts', '.js', '.json']
},
target: 'node',
output: {
libraryTarget: 'commonjs',
path: path.join(__dirname, '.webpack'),
filename: '[name].js'
},
externals: [{
'aws-sdk': 'aws-sdk',
'mysql2': 'mysql2',
'sqlite3': 'sqlite3',
'tedious': 'tedious',
'pg-native': 'pg-native'
}],
module: {
loaders: [
{ test: /\.ts(x?)$/, loader: 'ts-loader' }
]
}
};
The definitions for all 40 functions are too large to post, but here's an example:
users-list:
  handler: src/handler/UserHandler.getUsers
  events:
    - http:
        path: users
        method: get
They pretty much all look the same; I've clipped out the VPC, authorizer, and environment config. I'm pretty confident that they're all configured correctly.
The handlers look good. However, version 2.x did not support individual packaging (in fact, it only copied the whole artifact per function). As a next step, you should add webpack-node-externals to your webpack configuration, so that the externals are determined automatically by webpack and individual packaging can make use of it:
// webpack config
const nodeExternals = require('webpack-node-externals');
...
externals: [ nodeExternals() ]
...
Additionally, newer webpack versions use a module: rules structure instead of module: loaders. You should change that too.
Please also check if you have set custom: webpackIncludeModules: true in your serverless.yml.
Then do a serverless package to test if it works. You'll find the zip packages that would be uploaded in the .serverless directory.
If aws-sdk should not be packaged, you can either put it into your devDependencies or use
# serverless.yml
custom:
  webpackIncludeModules:
    forceExclude:
      - aws-sdk
to keep it outside of your packages.
I've made your suggested changes to the webpack externals and added the webpackIncludeModules configuration to the serverless custom config; I still seem to be experiencing the same problem, though.
It also appears to be related to the number of functions in this serverless project; if I comment out all but 5 of them, then sls package works.
Hmmm... that sounds like a memory leak somewhere when using individual packaging. We also have a project with more than 30 functions that works, but I did not check its memory consumption (i.e. whether we're about to hit a limit).
What you can try is to increase Node's heap memory limit (which is 1.7GB by default) with:
node --max-old-space-size=4096 node_modules/serverless/bin/serverless package
to 4GB, and check if it then passes with the full set of functions.
If that works, we have to find out where exactly the memory leak comes from and whether it can be fixed by reusing objects.
That definitely seems to be the problem. I got much further along; it looks like about 50% of the way through. If I bump it up to 12GB, the process finishes after about 8-10 minutes.
Good to know, thanks for testing this 👍. Can you adjust the title of the issue to reflect that this happens with many functions? Then it's clearer how to reproduce it and we can find a solution.
Is the workaround of using the increased heap OK for you as long as there's no real fix?
Sure thing. I think the 12GB heap size is probably a bit much; in addition, it seems to run significantly slower than our build currently does. I'll just opt not to use individual packaging for now. If/when this gets fixed, I can turn it on then.
The slower runtime is expected, because the plugin takes each webpack compile's output to determine the modules that are really needed for each function and assembles only those for the function package. That takes some time (when using --verbose you should see the exact steps, including their timing).
The longer build is outweighed by the better startup behavior (if the Lambdas are cold started), especially if some big dependencies are only used by one function.
@HyperBrain That makes sense, thanks!
I tried rolling back versions until I found one that didn't experience this issue. I got to 2.2.2, at which point my webpack config didn't work anymore.
@BobbieBarker Thanks for the investigation 👍 Support for individual packaging has been available since 3.0.0. Versions prior to that (2.x) were just 1.x versions that I released with the most important fixes (the project was quite dead when I took it over), but those old versions did not do individual packaging at all.
So I'm quite sure that the memory leak is somewhere in the individual packaging part (maybe the file copy). Did it also happen for you with serverless package?
Does anyone here know of a good Node performance analyzer (profiler) that can track the heap and the GC (ideally graphically), so that I can see when it starts to allocate objects?
I did some experiments with Node's internal profiler, node --trace_gc serverless package --verbose, on a project having 20+ functions (a JS project).
The outcome is that there seem to be no critical object remnants (or leaks) in the npm install or copy steps. The only step where memory consumption increases (but is always cleaned up by the GC) is the actual zipping of the function packages.
This behavior matches the log above: it crashed for you at the webpack step! And it seems to have loaded ts-loader multiple times. For my tested JS project, the memory showed roughly the same fill state before and after the webpack run.
So to find the root issue, we should concentrate on the webpack step, and especially on TypeScript. Did you experience the same issue without TypeScript in projects that have many functions? It seems that the webpack compile itself runs out of memory here.
I thought a bit about the issue. A workaround could be for the plugin to run the compiles in batches of a few functions at once. However, I do not know if the webpack library frees the allocated resources after each compile. But it could be worth a try.
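A minimal sketch of that batching idea (a hypothetical helper, not plugin code — batchEntries is a name invented here) could look like this:

```javascript
// Hypothetical sketch: split a serverless-webpack entries object
// (e.g. { 'src/handler/A': './src/handler/A.ts', ... }) into batches
// of `size` entries, so each batch could be handed to a separate
// webpack compile and resources could be released in between.
function batchEntries(entries, size) {
  const keys = Object.keys(entries);
  const batches = [];
  for (let i = 0; i < keys.length; i += size) {
    const batch = {};
    for (const key of keys.slice(i, i + size)) {
      batch[key] = entries[key];
    }
    batches.push(batch);
  }
  return batches;
}

// Each batch would then be compiled sequentially, e.g.:
// for (const entry of batchEntries(slsw.lib.entries, 5)) {
//   /* run webpack with { ...config, entry } and wait for it */
// }
```

Whether this actually bounds peak memory depends on webpack releasing its compile state between batches, which is exactly the open question above.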
According to the crash trace, it already happened after 7 compiles (if every ts-loader line is for one function) and was at 1500 MB:
[42611:0x104001600] 55964 ms: Mark-sweep 1405.7 (1508.8) -> 1405.7 (1508.8) MB, 1721.0 / 0.0 ms allocation failure GC in old space requested
The first try should be to disable some plugins in the webpack.config and check whether ts-loader might be allocating all the memory.
I hit this too after setting
package:
  individually: true
I don't think I can declare anything else of significance other than having only 9 functions. Do ask though; I'll check whatever's necessary. Here's my webpack:
const {resolve} = require('path');
const slsWebpack = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');
module.exports = {
target: 'node',
devtool: 'inline-source-map',
entry: slsWebpack.lib.entries,
externals: [nodeExternals()],
output: {
libraryTarget: 'commonjs',
path: resolve('builds/dist'),
filename: '[name].js'
},
resolve: {
extensions: ['.ts', '.js']
},
module: {
rules: [
loader({
test: /\.ts$/,
use: {
loader: 'ts-loader',
options: { }
}
}),
loader({
test: /\.graphqls$/,
use: {
loader: 'graphql-tag/loader',
}
})
]
}
};
function loader(config) {
return Object.assign(config, {
// exclude: [/node_modules/,/builds/, /test/],
});
}
These might be useful:
"ts-loader": "3.1.1",
"serverless": "1.25.0",
"serverless-webpack": "4.2.0",
"webpack": "3.10.0",
"serverless-plugin-cloudfront-lambda-edge": "1.0.0",
The output after running sls package:
Serverless: Bundling with Webpack...
<--- Last few GCs --->
[52295:0x103000000] 68990 ms: Mark-sweep 1407.7 (1499.7) -> 1407.6 (1477.7) MB, 2150.7 / 0.0 ms (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 2151 ms) last resort
[52295:0x103000000] 71182 ms: Mark-sweep 1407.6 (1477.7) -> 1407.6 (1477.7) MB, 2191.4 / 0.0 ms last resort
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x1e68da628799 <JSObject>
1: bind(aka bind) [/Users/Birowsky/Projects/Personal/###obfuscated###/node_modules/typescript/lib/typescript.js:~21002] [pc=0x3a6d9eaeaaef](this=0x307b27782311 <undefined>,node=0x3624801cc691 <NodeObject map = 0x1a660d8c8f59>)
2: forEachChild [/Users/Birowsky/Projects/Personal/###obfuscated###/node_modules/typescript/lib/typescript.js:~12719] [pc=0x3a6...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::Abort() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
2: node::FatalException(v8::Isolate*, v8::Local<v8::Value>, v8::Local<v8::Message>) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
3: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
4: v8::internal::Factory::NewByteArray(int, v8::internal::PretenureFlag) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
5: v8::internal::TranslationBuffer::CreateByteArray(v8::internal::Factory*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
6: v8::internal::compiler::CodeGenerator::PopulateDeoptimizationData(v8::internal::Handle<v8::internal::Code>) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
7: v8::internal::compiler::CodeGenerator::FinalizeCode() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
8: v8::internal::compiler::PipelineImpl::FinalizeCode() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
9: v8::internal::compiler::PipelineCompilationJob::FinalizeJobImpl() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
10: v8::internal::Compiler::FinalizeCompilationJob(v8::internal::CompilationJob*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
11: v8::internal::OptimizingCompileDispatcher::InstallOptimizedFunctions() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
12: v8::internal::StackGuard::HandleInterrupts() [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
13: v8::internal::Runtime_StackGuard(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/Birowsky/.nvm/versions/node/v8.6.0/bin/node]
14: 0x3a6d9e7846fd
Abort trap: 6
@Birowsky Thanks for the info 🥇 @BobbieBarker, @daniel-cottone, can you confirm that this setting also works for you?
@HyperBrain That setting does appear to be working for me. I'll look into using fork-ts-checker-webpack-plugin to maintain type checking. Thanks!
@daniel-cottone please share your thoughts after you succeed. I was thinking of doing a single tsc --noEmit before deploying, but maybe your approach is more rational.
@Birowsky Seems to work. The only gripe I have is that the type checking doesn't fail fast; if you'd prefer to check types before you even start the (potentially long) build, then maybe tsc --noEmit is the better option. For now I'm going to stick with just using the plugin.
@daniel-cottone I've been dealing with the same issue for a couple of weeks now. Using fork-ts-checker-webpack-plugin will spawn a thread per function to type check. I'm wondering whether fork-ts-checker is smart enough to type check just the specific Lambda, or whether it type checks the entire project since it's based on tsconfig.json. My project has 20+ functions, so fork-ts-checker spawns 20+ threads just for type checking. It works, but I don't think it's necessary.
@HyperBrain with transpileOnly: true, it starts to crash around 30+ functions
@HyperBrain @VuBui83 I've also experienced the same problem; setting transpileOnly: true makes a huge difference, but I still get crashes around 30 functions. I've also gone the route of manually type checking with tsc --noEmit rather than using fork-ts-checker-webpack-plugin.
Maybe a solution would be to provide a PR for the ts-checker plugin that limits the number of spawned processes when using multi-compiles in webpack.
Did someone here try https://github.com/webpack-contrib/thread-loader in combination with ts-loader or does that make no difference?
I did; it still crashed with these loaders:
module: {
rules: [
{
test: /\.tsx?$/,
use: [
{ loader: 'cache-loader' },
{
loader: 'thread-loader',
options: {
// there should be 1 cpu for the fork-ts-checker-webpack-plugin
workers: require('os').cpus().length - 1,
},
},
{
loader: 'ts-loader',
options: {
happyPackMode: true,
transpileOnly: true
}
}
]
}
]
}
Serverless: Bundling with Webpack...
<--- Last few GCs --->
[4920:0x391c8b0] 175834 ms: Mark-sweep 1155.1 (1502.1) -> 1155.0 (1503.1) MB, 695.4 / 0.0 ms allocation failure GC in old space requested
[4920:0x391c8b0] 176489 ms: Mark-sweep 1155.0 (1503.1) -> 1154.9 (1450.6) MB, 655.1 / 0.0 ms last resort GC in old space requested
[4920:0x391c8b0] 177153 ms: Mark-sweep 1154.9 (1450.6) -> 1154.9 (1437.6) MB, 663.7 / 0.0 ms last resort GC in old space requested
<--- JS stacktrace --->
==== JS stack trace =========================================
Security context: 0x1f3b69da5501 <JSObject>
1: /* anonymous */ [/mnt/c/AccountServices/node_modules/webpack/node_modules/webpack-sources/node_modules/source-map/lib/source-node.js:~342] [pc=0x3dab59846f57](this=0x3597b2d8c389 <JSGlobal Object>,chunk=0x2c2f5e74b4c1 <String[60]\: xxx('_applySerializers: excludeFields', excludeFields);\n>,original=0x3added1b95b9 <Object map = 0x17ffa9905dd1>)
2: SourceNode_walk [/mnt/c...
FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - JavaScript heap out of memory
1: node::Abort() [node]
2: 0x11f155c [node]
3: v8::Utils::ReportOOMFailure(char const*, bool) [node]
4: v8::internal::V8::FatalProcessOutOfMemory(char const*, bool) [node]
5: v8::internal::Factory::NewUninitializedFixedArray(int) [node]
6: 0xdf2313 [node]
7: v8::internal::Runtime_GrowArrayElements(int, v8::internal::Object**, v8::internal::Isolate*) [node]
8: 0x3dab58f842fd
Aborted (core dumped)
Hi everyone, I spent a couple of hours trying to debug this problem, and my conclusion is that there's a memory leak in webpack or something below it. I tried ts-loader, awesome-typescript-loader, thread-loader, cache-loader, happypack, and fork-ts-checker-webpack-plugin in every combination.
I wrote a test script, webpack-test.js, to debug only webpack and tried every possible way to drop references so the GC could collect. Screenshot from node-gc-viewer below.
I see a possible workaround, but it's nasty... invoke a child Node process (but please, not the way fork-ts-checker-webpack-plugin does it) to compile TS with webpack, or... fix webpack 😄
My setup:
tsconfig.json
{
"compilerOptions": {
"sourceMap": true,
"target": "es6",
"types": [
"node"
],
"moduleResolution": "node"
}
}
webpack.config.js
module.exports = {
entry: {},
target: 'node',
output: {
libraryTarget: 'commonjs2',
path: '<absolute patch to project>/.webpack',
filename: '[name].js',
},
module: {
rules: [
{
test: '\.ts(x?)$',
use: [
{
loader: 'ts-loader'
}
],
}
]
},
externals: {
'crypto': true,
'aws-sdk': true
},
resolve: {
extensions: [
'.js',
'.jsx',
'.json',
'.ts',
'.tsx'
],
alias: {
'handlebars': 'handlebars/dist/handlebars.js'
}
}
};
webpack-test.js
const webpackConfig = require('./webpack.config');
const entries = {
'src/handler/AuthorizerHandler': './src/handler/AuthorizerHandler.ts',
'src/handler/yyy/LoginHandler': './src/handler/yyy/LoginHandler.ts',
'src/handler/xxx/GetAllHandler': './src/handler/xxx/GetAllHandler.ts',
'src/handler/xxx/GetOneHandler': './src/handler/xxx/GetOneHandler.ts',
'src/handler/xxx/CreateHandler': './src/handler/xxx/CreateHandler.ts',
'src/handler/xxx/UpdateHandler': './src/handler/xxx/UpdateHandler.ts',
'src/handler/xxx/DeleteHandler': './src/handler/xxx/DeleteHandler.ts',
'src/handler/zzz/GetAllHandler': './src/handler/zzz/GetAllHandler.ts',
'src/handler/zzz/GetOneHandler': './src/handler/zzz/GetOneHandler.ts',
'src/handler/zzz/CreateHandler': './src/handler/zzz/CreateHandler.ts',
'src/handler/zzz/UpdateHandler': './src/handler/zzz/UpdateHandler.ts',
'src/handler/zzz/DeleteHandler': './src/handler/zzz/DeleteHandler.ts',
'src/handler/zzz/PublishHandler': './src/handler/zzz/PublishHandler.ts',
// 'src/handler/qqq/GetAllHandler': './src/handler/qqq/GetAllHandler.ts',
// 'src/handler/qqq/GetOneHandler': './src/handler/qqq/GetOneHandler.ts',
// 'src/handler/qqq/CreateHandler': './src/handler/qqq/CreateHandler.ts',
// 'src/handler/qqq/UpdateHandler': './src/handler/qqq/UpdateHandler.ts',
// 'src/handler/qqq/DeleteHandler': './src/handler/qqq/DeleteHandler.ts',
// 'src/handler/aaa/GetAllHandler': './src/handler/aaa/GetAllHandler.ts',
// 'src/handler/aaa/GetOneHandler': './src/handler/aaa/GetOneHandler.ts',
// 'src/handler/aaa/CreateHandler': './src/handler/aaa/CreateHandler.ts',
// 'src/handler/aaa/UpdateHandler': './src/handler/aaa/UpdateHandler.ts',
// 'src/handler/aaa/DeleteHandler': './src/handler/aaa/DeleteHandler.ts'
};
const queue = [];
for (const key of Object.keys(entries)) {
const value = entries[key];
queue.push([key, value]);
}
let working = false;
let webpack = null;
let compiler = null;
let config = null;
const configJson = JSON.stringify(webpackConfig);
const interval = setInterval(intervalF, 1000);
function intervalF() {
if (working) {
return;
}
if (queue.length === 0) {
console.log('DONE!');
clearInterval(interval);
return;
}
working = true;
const [key, value] = queue.pop();
config = null;
webpack = null;
compiler = null;
config = JSON.parse(configJson);
config.module.rules[0].test = new RegExp(config.module.rules[0].test);
config.entry[key] = value;
console.log(config);
webpack = require('webpack');
console.log('COMPILING', key, value);
compiler = webpack(config, (err, stats) => {
working = false;
console.log('COMPILED', key, value);
});
}
package.json
"devDependencies": {
"@types/aws-lambda": "0.0.22",
"@types/handlebars": "^4.0.36",
"@types/jsonwebtoken": "^7.2.5",
"@types/node": "^8.0.57",
"@types/uuid": "^3.4.3",
"awesome-typescript-loader": "^3.4.1",
"aws-sdk": "^2.176.0",
"cache-loader": "^1.2.0",
"fork-ts-checker-webpack-plugin": "^0.3.0",
"happypack": "^4.0.1",
"serverless": "^1.25.0",
"serverless-domain-manager": "^2.0.2",
"serverless-dynamodb-local": "^0.2.26",
"serverless-kms-secrets": "^1.0.2",
"serverless-offline": "^3.16.0",
"serverless-webpack": "^4.0.0",
"thread-loader": "^1.1.2",
"ts-loader": "^2.3.7",
"typescript": "^2.5.2",
"webpack": "^3.6.0"
},
"dependencies": {
"handlebars": "^4.0.11",
"jsonwebtoken": "^8.1.0",
"uuid": "^3.1.0"
}
Much appreciated effort, Grumpy! When somebody fixes this, instead of all my lambdas weighing 30MB each, most of them will go below 1MB. So trust me, I appreciate efforts like this.
The most feasible workaround for this right now is simply to turn off individual packaging.
I think changing the title to "JavaScript heap out of memory when packaging many functions" makes more sense now that it has been isolated to just the packaging process and not the deployment process.
Has anyone tried whether webpack v4.0.0 can fix this?
@dashmug I tried the RC two days ago and it didn’t fix the problem for me. But I’d like to hear other people’s experience.
@dashmug Webpack 4.0.0 doesn't fix it for me. My project uses babel and the issue seems to happen only when enabling source maps (devtool: 'source-map').
While preparing version 5.0.0, I recognized that we use ts-node to enable support for TS webpack configuration files. Currently ts-node is referenced as ^3.2.0 in the plugin's package.json, but I saw that there is already a ^5.0.0 version of ts-node available. Does anybody know if I can upgrade it in the plugin's package.json without breaking anyone's projects, or should I keep it at the current version?
YMMV, but I'm currently testing what's in this article about using cache-loader and thread-loader.
Initial results are fine so far, though I have only tested on my MacBook with 16GB of RAM and will still have to test on our CI, which only has 3GB of RAM :-).
Working config so far...
'use strict'
const os = require('os')
const path = require('path')
const ForkTsCheckerWebpackPlugin = require('fork-ts-checker-webpack-plugin')
const slsw = require('serverless-webpack')
const webpack = require('webpack')
const nodeExternals = require('webpack-node-externals')
module.exports = {
context: __dirname,
entry: slsw.lib.entries,
target: 'node',
output: {
libraryTarget: 'commonjs2',
path: path.join(__dirname, 'build'),
filename: '[name].js',
},
module: {
rules: [{
test: /\.ts$/,
use: [{
loader: 'cache-loader',
}, {
loader: 'thread-loader',
options: {
// There should be 1 cpu for the
// fork-ts-checker-webpack-plugin
workers: os.cpus().length - 1,
},
}, {
loader: 'ts-loader',
options: {
// IMPORTANT! use happyPackMode mode to speed-up
// compilation and reduce errors reported to webpack
happyPackMode: true,
},
}],
}],
},
externals: [nodeExternals()],
resolve: {
extensions: ['.ts', '.js'],
},
devtool: 'source-map',
plugins: [
new ForkTsCheckerWebpackPlugin({ checkSyntacticErrors: true }),
],
}
@dashmug As far as I remember, fork-ts-checker-webpack-plugin compiles TypeScript to JavaScript quickly and spawns a thread to check errors. Each of the spawned check threads runs with a default 2048 MB memory limit and starts immediately, without any queue mechanism. So in the worst case, memory usage is lambda count * memory limit.
@grumpy-programmer It's a workaround that worked on my local but didn't work on our CI environment (AWS CodeBuild using 3GB). I had to bump up the RAM to 7GB for it to work.
@grumpy-programmer I just inspected the code of https://github.com/Realytics/fork-ts-checker-webpack-plugin to see if any changes can be made to restrict the number of processes spawned.
What I found there is const division = parseInt(process.env.WORK_DIVISION, 10); which seems to control the number of worker processes spawned for the plugin.
Can anyone of you try to set process.env.WORK_DIVISION to a smaller value (maybe 2) and check if the memory consumption still explodes with bigger services? Additionally, I found that it uses process.env.MEMORY_LIMIT to set the Node VM heap size per worker, which could be an additional screw to turn to get it under control.
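As an untested sketch (the variable names are taken from the plugin source quoted above, not from documented plugin options), both could be set at the top of webpack.config.js before the plugin spawns its workers:

```javascript
// Untested sketch: constrain fork-ts-checker-webpack-plugin's workers
// via the env vars found in its source. Setting them at the top of
// webpack.config.js should make them visible before workers spawn.
process.env.WORK_DIVISION = '2';    // number of checker worker processes
process.env.MEMORY_LIMIT = '2048';  // heap size per worker, in MB
```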
Bought a new laptop with an i8 quad core and 16 GB of RAM, and this issue is happening more often than on my i5 duo with 8 GB of RAM??
I have the same problem, but without TS. Disabling source maps helps, but that can't be the solution. :( Any hints on how to optimize memory consumption for source map creation?
@akleiber Is this happening in a quite big project?
You could try setting devtool: "nosources-source-map" to prevent embedding the whole sources into the source maps; only the line numbers are kept. In most cases this is fully sufficient and might reduce the memory consumption.
However, there are some issues in the webpack repository about OOM in combination with source maps. According to this recent comment https://github.com/webpack/webpack/issues/4727#issuecomment-373692350 it should be solved in the latest source-map module when used with the latest webpack version.
What is the webpack version you use?
cache-loader and thread-loader helped significantly for me.
Thanks @Birowsky!
transpileOnly: true
Definitely something wrong with ts-loader: setting the transpileOnly option to true, we went from a 9-minute deployment time to 2 minutes and got rid of the CALL_AND_RETRY_LAST error.
Our setup:
package.individually = true
I've started to hit extremely long webpack times and also the JavaScript heap out of memory error.
My stack has 14 functions.
I've upgraded my t2 instance for now, but will look at adjusting the heap as described above. I'm really concerned, though, about how long the webpack step takes (30 minutes at minimum).
I've upgraded to webpack@3.12.0 & serverless-plugin-webpack@1.5.1. My webpack.config.js is below:
const StatsWriterPlugin = require("webpack-stats-plugin").StatsWriterPlugin;
const webpack = require("webpack");
const BabiliPlugin = require("babili-webpack-plugin");
module.exports = {
target: "node",
externals: [/aws-sdk/],
plugins: [
new StatsWriterPlugin(),
new BabiliPlugin(),
],
module: {
rules: [
{
test: /src\/.*\.js$/,
exclude: [
/node_modules/,
/test/],
loader: "babel-loader",
options: {
presets: [
[
"env",
{
targets: { node: "6.10" }, // Node version on AWS Lambda
useBuiltIns: true,
modules: false,
loose: true,
},
],
"es2015",
"stage-0",
],
plugins: ["transform-runtime" ]
},
},
],
}
};
and my serverless package section looks like:
package:
  individually: true
  include:
    - src/**/*
  exclude:
    - ./*
    - test/**
    - node_modules/**
I'm no expert in Node or webpack, so any tips or ideas on how to improve the packaging performance would be greatly appreciated. I'm getting around it for now by deploying functions individually, but if I need to deploy the whole stack, I'm kissing a lot of time goodbye. Thanks!
Hi all,
I ran the serverless package command while increasing the heap, and it completed OK. Output below for the 1st module:
$ node --max-old-space-size=8192 /home/ec2-user/.nvm/versions/node/v7.10.1/lib/node_modules/serverless/bin/serverless package
Serverless: Bundling with webpack...
Child
Time: 519248ms
Asset Size Chunks Chunk Names
src/marlin/captureDeviceAlarm.js 1.52 MB 0 [emitted] [big] src/marlin/captureDeviceAlarm.js
stats.json 107 bytes [emitted]
[26] ../base/src/log.js 432 bytes {0} [built]
[51] external "aws-sdk" 42 bytes {0} [not cacheable]
[66] ../base/node_modules/babel-runtime/core-js/json/stringify.js 95 bytes {0} [built]
[104] ../base/src/helper.js 3.53 kB {0} [built]
[121] ../base/node_modules/babel-runtime/regenerator/index.js 49 bytes {0} [built]
[122] ../base/node_modules/babel-runtime/helpers/asyncToGenerator.js 906 bytes {0} [built]
[338] ./src/marlin/captureDeviceAlarm.js 394 bytes {0} [built]
[339] ../base/node_modules/bole/bole.js 5.11 kB {0} [built]
[344] ../base/node_modules/bole-console/lib/index.js 3.23 kB {0} [built]
[346] ../base/src/device.js 29.1 kB {0} [built]
[484] ../base/src/user.js 6.55 kB {0} [built]
[628] ../base/src/sierra.js 5.69 kB {0} [built]
[629] ../base/src/notifications.js 10.4 kB {0} [built]
[634] ../base/database/marlin-config.js 6.52 kB {0} [built]
[635] ../base/node_modules/axios/index.js 40 bytes {0} [built]
+ 645 hidden modules
Do I need to be concerned about the +645 hidden modules? Is that why it's taking so long, perhaps?
FWIW, I implemented the changes that @dashmug mentioned in his post, and it looks like my current project is back in business.
I'm finding much better performance by increasing the heap using:
node --max-old-space-size=4096 node_modules/serverless/bin/serverless package
I only ever do a full deploy with the increased heap when a new function is created; otherwise I now just use sls deploy function when updating a single function.
I'd still love to know more about my question re the +645 hidden modules and whether that indicates a setup or config issue or is normal.
I don't even understand why this is an issue here. I have 7 functions, but all of them are very small; the overall project is tiny. I run much bigger projects with webpack, with the same loaders (and more stuff), and almost never hit these heap errors (the last I remember was back on webpack 1). So I don't think the solution here should be focused on changing loader configurations, but on the way that serverless-webpack executes webpack.
This is a Bug Report
Description
I'm in the process of trying to upgrade the serverless-webpack version from 2.2.3, where I do not experience the following issue. Our serverless configuration has package: individually: true set, and about 40 functions. When I try to upgrade to a later version of serverless-webpack and run sls webpack, the build will run for about a minute and then I get the following error:
If I change my serverless config to not package individually (package: individually: false), then this error goes away. I have tested this with version 3.0.0 and the latest, 4.1.0, with the same results. I don't have this issue with 2.2.3.
Additional Data