Update: I changed from the `all_tests` approach back to including individual spec files, and it did crash with an out-of-heap-memory error. Looks like I'm stuck with one big bundle still, which means the issue persists.
Thanks for opening an issue. The out-of-memory error is something I've seen reported in the past, but we don't have any reproductions that allow us to debug/triage it at this time. If the project you're working on is open source by chance, it would be nice to have a way for us to reproduce this.
Why are you setting `runtimeChunk` to `true`? Internally, karma-webpack relies on the runtime path being exactly `runtime.js`. Setting `runtimeChunk: true` changes the path of the runtime file, so the plugin is unable to pick up and run the tests it expects to find under `runtime.js`.
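For reference, the option in question looks like this in a webpack config (a minimal illustration, not your actual setup):

```js
// webpack.config.js -- illustrative only
module.exports = {
  mode: 'development',
  optimization: {
    // `true` is an alias for 'multiple': webpack emits one runtime chunk per
    // entry, named `runtime~<entryname>`, instead of the single `runtime.js`
    // that karma-webpack expects to find.
    runtimeChunk: true,
  },
};
```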
If anything, we may want to limit the ability to set these optimization options in this version of karma-webpack, since changing them will lead to unexpected failures such as this. Though I may be unaware of cases where it's necessary, is there a need for this in your project?
Re: the memory error, the build is for a sizeable application with a few big dependencies. The production bundle size is over 7 MB, but the dev build with sourcemaps etc. is closer to 60 MB (using the default `optimization` options in dev mode).
Unfortunately it's not open source. It was already quite large when we started to add tests, so I can say that the plugin worked fine in "bundle per spec" mode until we had maybe a dozen total spec files. I suspect that part of the problem is that a lot of the code under test uses a big dependency: the `cesium` package is over 3 MB and I use a lot of it, so tree-shaking isn't very helpful. Maybe some kind of chunk splitting / module federation in dev mode would let the various spec bundles share vendor code?
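Something like the following is what I have in mind (an untested sketch; `vendor-cesium` is just a placeholder name, and it might conflict with whatever `splitChunks` settings karma-webpack applies itself):

```js
// webpack dev config sketch: pull cesium out into a shared vendor chunk so the
// individual spec bundles don't each carry their own copy of it.
module.exports = {
  optimization: {
    splitChunks: {
      chunks: 'all',
      cacheGroups: {
        cesium: {
          test: /[\\/]node_modules[\\/]cesium[\\/]/,
          name: 'vendor-cesium',
          chunks: 'all',
          enforce: true,
        },
      },
    },
  },
};
```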
Anyway, re: `runtimeChunk`, I turned it on recently per this guide and, along with several other fixes from that link, I was able to cut roughly a third of my dev build time. I'm only using it in dev, not prod, and I'm certainly not attached to it; if karma-webpack wants to force it to a particular value, that would certainly make sense to me. I just wanted to make sure that I reported the current (erroneous) behavior so that it can be addressed.
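The dev-only settings involved are roughly the following (paraphrased from memory, so treat the exact flags as illustrative rather than a quote from that guide):

```js
// webpack.config.js, development only -- illustrative paraphrase
module.exports = {
  mode: 'development',
  output: { pathinfo: false },
  optimization: {
    runtimeChunk: true,            // the option that trips up karma-webpack
    removeAvailableModules: false, // skip extra optimization passes in dev
    removeEmptyChunks: false,
  },
};
```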
ETA: I just realized that this is similar to a suggestion I made in another thread. When I first set up karma-webpack, the guide I followed said that you have to delete the `entry` section from your webpack config before passing it to the plugin. My suggestion was: why not have the plugin just delete the key, if having a value there makes the plugin not work? The same goes for this: if having `optimization.runtimeChunk = true` breaks karma-webpack, by all means it should override/delete that value.
I agree. At least for the time being, while we know that this causes the plugin to fail, we can just override any change to `runtimeChunk` and log a warning. I actually did make this change in v5.0.0 for `entry` as well: if you set the property, it will be removed and a warning will be logged.
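Something along these lines (just a sketch of the idea; the real change may end up looking different):

```js
// Hypothetical guard inside karma-webpack's config handling.
function sanitizeUserWebpackConfig(webpackConfig) {
  const optimization = webpackConfig.optimization;
  if (optimization && optimization.runtimeChunk !== undefined) {
    console.warn(
      '[karma-webpack]: "optimization.runtimeChunk" is managed by karma-webpack and will be overridden.'
    );
    delete optimization.runtimeChunk;
  }
  return webpackConfig;
}
```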
I'll try and get that into 5.1.0 sometime this week.
I'll also make a project with a ton of dependencies and see if I can force the out of memory error to occur this week as well.
Expected Behavior
All specs run
Actual Behavior
No specs run
Code
How Do We Reproduce?
I haven't put together a separate minimal repro but I'm basically using the config that used to be suggested under the heading "Alternative Usage" in the README:
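Roughly, the setup looks like this (a simplified sketch of that pattern, not my exact files; the Jasmine framework, paths, and spec regex here are just placeholders):

```js
// karma.conf.js -- simplified sketch
const webpackConfig = require('./webpack.config.js');
delete webpackConfig.entry; // per the old guidance, entry must not be set here

module.exports = (config) => {
  config.set({
    frameworks: ['jasmine', 'webpack'],
    files: [
      'all_tests.js',
      // 'src/**/*.spec.js', // blows up runner memory usage
    ],
    preprocessors: {
      'all_tests.js': ['webpack'],
    },
    webpack: webpackConfig,
  });
};
```

```js
// all_tests.js -- single entry that pulls in every spec file via require.context
const context = require.context('./src', true, /\.spec\.js$/);
context.keys().forEach(context);
```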
This file is passed to the Karma config's `files` list. When Karma is launched, webpack bundles the tests and all deps into `all_tests.####.js` and creates a separate `runtime` script, but for some reason the specs do not execute.

I noticed that my config is no longer suggested in the README. I can try switching back to passing each spec to Karma individually (the commented-out "blows up runner memory usage" line above) and, if that fixes it (and no longer blows up memory), we can close the issue.