Open benmonro opened 4 years ago
A fixed version of JSDOM has been released; you can use it by installing jest-environment-jsdom-sixteen. @EvHaus, could you verify it fixes your issue?
@SimenB my issue is probably not related, but I tried jest-environment-jsdom-sixteen vs using the default, and saw a 20s increase in runtime for the same test suite over repeated runs.
Over using v15 (which is what ships with Jest by default) and no other changes? Could you test with 16.1.0 as well (although that leaks memory, so it might be harder to test)? JSDOM just landed custom element support; there might be some regression in there. Not sure.
Unfortunately, I'm still getting heap errors with jest-environment-jsdom-sixteen. The last stable working version of JSDOM for me is jest-environment-jsdom-thirteen.
The environment works with our codebase, but we're still seeing an almost 100% regression in runtime. Anecdotally, jest-environment-jsdom-sixteen seems to improve run-time performance by only ~10% when comparing 25.1 vs 24.9.
Hi @SimenB, I've put together a reproducible case here: https://github.com/olebedev/jest-perf-issue. Please take a look. Results to compare are below. /cc @joscha
Results
Benchmarks were run on a MBP 2019, 32 GB RAM, i9-8950HK CPU @ 2.90GHz.

| jest version | branch | time |
|---|---|---|
| 24.9.0 | master | 348.077s |
| 25.1.0 | jest25 | 591.33s |
In our case, where Jest v25.1 was ~50% slower than v24.9, the latest v25.2.0 is a further 20% slower than v25.1 🙈
@olebedev Woah, that's painful 😬
I'm getting similar numbers to you. If it's based on a real project I recommend using v8 coverage. It takes the runtime from 600s to 35s on my machine in your reproduction. The reason for the huge diff is probably that we don't try to instrument non-covered files with v8 coverage (we just say every byte is uncovered, which works with v8).
https://jestjs.io/docs/en/configuration#coverageprovider-string
Not sure why it's so slow... I'll try to find some time to look into it (won't be anytime soon though). It should at least not be any slower on v25 than v24
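For anyone who wants to try this, switching the provider is a one-line config change. A minimal jest.config.js sketch (everything except coverageProvider is just a placeholder):

// jest.config.js
module.exports = {
  collectCoverage: true,
  // 'v8' uses V8's built-in coverage instead of babel-plugin-istanbul instrumentation
  coverageProvider: 'v8',
};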
Do I understand the docs correctly that we can use v8 coverage together with the ...-sixteen environment?
Cheers, J
Yes, either jest-environment-jsdom-sixteen, or the bundled jest-environment-node. Also note that it's only supported on Node 10+, not Node 8.
(I've only ever tested this with the Node env, but if it doesn't work with the jsdom env, that's a bug - please open a separate issue 🙂)
jest-environment-jsdom-sixteen + v8 as coverage provider was worse by about 20% for us on jest 25.3.0, node 12.16. We're also trying to debug why our test performance got worse by about 80% going from jest 24 to 25.
@joual did you try the micromatch workaround (downgrade to 3.x)?
Having a similar experience here: test times (without coverage) double on v25, from 35-40 seconds to 80-90 and sometimes more. Tried locking micromatch to v3, no measurable difference. FWIW, we have around 3k tests, of which 58 are snapshot tests. Attempted to downgrade jsdom, but this introduces lots of test breakage due to recent features we've been using. Will see if I can get around this somehow and report back.
@SimenB The coverage job on the Prettier project is also very slow, can you check it? https://github.com/prettier/prettier/runs/579497097 (Node.js 12 on ubuntu-latest collects coverage; the other jobs don't.)
I was experimenting with different jsdom versions today on jest@24 (which ships jsdom v11 by default). Up to v14 everything seems to work fine, but as of v15 test runs consistently take 50-60% longer. Same story in v16. I'll see if I can get similar perf on jest@25 by downgrading jsdom to v14.
JSDOM@16.2.2 has some memory fixes, @EvHaus could you try it out? Jest's own --detect-leaks finds leaks in previous versions, but not in 16.2.2.
I've also landed some other improvements if you have lots of symlinks (which is super slow on windows), so if people could try out jest@25.5.3 that would be lovely 🙂
@SimenB What's the easiest way to test that? If I add jsdom@16.2.2 directly as a devDependency, Jest ignores it and uses whatever is bundled with jest-environment-jsdom, which is 15.2.1 today. How can I trick npm into using the jsdom version I want?
Install jest-environment-jsdom-sixteen and use it via the testEnvironment option: https://jestjs.io/docs/en/configuration#testenvironment-string
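For reference, the override boils down to one dev dependency plus one config line; a minimal sketch (assuming the config lives in jest.config.js rather than package.json):

// jest.config.js
module.exports = {
  // replaces the bundled jsdom 15 environment with one that ships jsdom 16
  testEnvironment: 'jest-environment-jsdom-sixteen',
};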
Alpha published, so you can try jest@next - it comes with jsdom 16 out of the box.
@SimenB Sorry, not much luck with jest@26.0.0-alpha.0 and its jsdom@16.2.2. Still getting many of these errors:
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
And eventually the runner dies.
My details:
> npx envinfo --preset jest
System:
OS: macOS 10.15.4
CPU: (8) x64 Intel(R) Core(TM) i5-8279U CPU @ 2.40GHz
Binaries:
Node: 10.17.0 - ~/.nvm/versions/node/v10.17.0/bin/node
Yarn: 1.22.4 - /usr/local/bin/yarn
npm: 6.14.4 - ~/.nvm/versions/node/v10.17.0/bin/npm
npmPackages:
jest: ^26.0.0-alpha.0 => 26.0.0-alpha.0
Here are some of the full stacks returned from those:
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0x10003d041 node::Abort() [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
2: 0x10003d24b node::OnFatalError(char const*, char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
3: 0x1001b8e25 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
4: 0x100586d82 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
5: 0x100589855 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
6: 0x1005856ff v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
7: 0x1005838d4 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
8: 0x10059016c v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
9: 0x1005901ef v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
10: 0x10055fb34 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
11: 0x1007e7e14 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
12: 0x3f82f50dbe3d
13: 0x3f82f5c630de
14: 0x3f82f5c94431
15: 0x3f82f5c7d3be
16: 0x3f82f5c4e98b
17: 0x3f82f5c3c38e
<--- Last few GCs --->
[50818:0x102804000] 189738 ms: Mark-sweep 1288.8 (1450.6) -> 1280.2 (1454.1) MB, 890.1 / 0.1 ms (average mu = 0.181, current mu = 0.061) allocation failure scavenge might not succeed
[50818:0x102804000] 190673 ms: Mark-sweep 1292.8 (1454.1) -> 1282.9 (1457.6) MB, 856.2 / 0.2 ms (average mu = 0.136, current mu = 0.084) allocation failure scavenge might not succeed
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x3f82f50dbe3d]
Security context: 0x274d67c9e6e9 <JSObject>
1: createImpl [0x274d6d9ba1b9] [/Users/evhaus/Git/zenhub/client/node_modules/jsdom/lib/jsdom/living/generated/HTMLInputElement.js:~47] [pc=0x3f82f5c630de](this=0x274d51911261 <Object map = 0x274dd51fe489>,globalObject=0x274d89d38609 <Window map = 0x274d2fe6c211>,constructorArgs=0x274d832134b1 <JSArray[0]>,privateData=0x274d832134d1 <Object map = 0x274d69...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0x10003d041 node::Abort() [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
2: 0x10003d24b node::OnFatalError(char const*, char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
3: 0x1001b8e25 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
4: 0x100586d82 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
5: 0x100589855 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
6: 0x1005856ff v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
7: 0x1005838d4 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
8: 0x10059016c v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
9: 0x1005901ef v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
10: 0x10055fb34 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
11: 0x1007e7e14 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
12: 0x2cea552dbe3d
13: 0x2cea55937c04
14: 0x2cea5592618b
<--- Last few GCs --->
[51263:0x102804000] 34292 ms: Mark-sweep 1332.4 (1452.5) -> 1320.5 (1453.5) MB, 902.6 / 0.0 ms (average mu = 0.149, current mu = 0.104) allocation failure scavenge might not succeed
[51263:0x102804000] 35480 ms: Mark-sweep 1332.6 (1453.5) -> 1323.6 (1457.5) MB, 1049.3 / 0.0 ms (average mu = 0.131, current mu = 0.116) allocation failure scavenge might not succeed
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x2cea552dbe3d]
Security context: 0x1a4cb371e6e9 <JSObject>
1: next [0x1a4ca627dcd1] [/Users/evhaus/Git/zenhub/client/node_modules/symbol-tree/lib/TreeIterator.js:~16] [pc=0x2cea55937c04](this=0x1a4c807c75b1 <TreeIterator map = 0x1a4c38b8a9c9>)
2: shadowIncludingInclusiveDescendantsIterator(aka shadowIncludingInclusiveDescendantsIterator) [0x1a4ca627a641] [/Users/evhaus/Git/zenhub/client/node_modules/jsdom/li...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0x10003d041 node::Abort() [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
2: 0x10003d24b node::OnFatalError(char const*, char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
3: 0x1001b8e25 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
4: 0x100586d82 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
5: 0x100589855 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
6: 0x1005856ff v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
7: 0x1005838d4 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
8: 0x10059016c v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
9: 0x1005901ef v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
10: 0x10055fb34 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
11: 0x1007e7e14 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
12: 0x3e0d8aedbe3d
13: 0x3e0d8b35eedc
<--- Last few GCs --->
[51519:0x102804000] 28074 ms: Mark-sweep 1324.5 (1445.0) -> 1315.7 (1449.0) MB, 760.4 / 0.0 ms (average mu = 0.182, current mu = 0.080) allocation failure scavenge might not succeed
[51519:0x102804000] 28906 ms: Mark-sweep 1328.5 (1449.0) -> 1317.7 (1452.0) MB, 770.4 / 0.0 ms (average mu = 0.129, current mu = 0.074) allocation failure scavenge might not succeed
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x3e0d8aedbe3d]
Security context: 0x3611d141e6e9 <JSObject>
1: queueMutationRecord(aka queueMutationRecord) [0x361185f32321] [/Users/evhaus/Git/zenhub/client/node_modules/jsdom/lib/jsdom/living/helpers/mutation-observers.js:~33] [pc=0x3e0d8b35eedc](this=0x361116e826f1 <undefined>,type=0x3611aa0a3681 <String[9]: childList>,target=0x36110b275a91 <EventTargetImpl map = 0x3611a254a2f1>,name=0x361116e822b1 <null>,name...
FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
1: 0x10003d041 node::Abort() [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
2: 0x10003d24b node::OnFatalError(char const*, char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
3: 0x1001b8e25 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
4: 0x100586d82 v8::internal::Heap::FatalProcessOutOfMemory(char const*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
5: 0x100589855 v8::internal::Heap::CheckIneffectiveMarkCompact(unsigned long, double) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
6: 0x1005856ff v8::internal::Heap::PerformGarbageCollection(v8::internal::GarbageCollector, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
7: 0x1005838d4 v8::internal::Heap::CollectGarbage(v8::internal::AllocationSpace, v8::internal::GarbageCollectionReason, v8::GCCallbackFlags) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
8: 0x10059016c v8::internal::Heap::AllocateRawWithLigthRetry(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
9: 0x1005901ef v8::internal::Heap::AllocateRawWithRetryOrFail(int, v8::internal::AllocationSpace, v8::internal::AllocationAlignment) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
10: 0x10055fb34 v8::internal::Factory::NewFillerObject(int, bool, v8::internal::AllocationSpace) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
11: 0x1007e7e14 v8::internal::Runtime_AllocateInNewSpace(int, v8::internal::Object**, v8::internal::Isolate*) [/Users/evhaus/.nvm/versions/node/v10.17.0/bin/node]
12: 0x8d8aedbe3d
<--- Last few GCs --->
[51526:0x102804000] 33125 ms: Mark-sweep 1318.6 (1425.0) -> 1317.7 (1424.0) MB, 874.8 / 0.0 ms (average mu = 0.126, current mu = 0.038) allocation failure scavenge might not succeed
[51526:0x102804000] 33136 ms: Scavenge 1318.5 (1424.0) -> 1318.0 (1424.5) MB, 3.8 / 0.0 ms (average mu = 0.126, current mu = 0.038) allocation failure
[51526:0x102804000] 33148 ms: Scavenge 1318.7 (1424.5) -> 1318.2 (1425.0) MB, 4.2 / 0.0 ms (average mu = 0.126, current mu = 0.038) allocation failure
<--- JS stacktrace --->
==== JS stack trace =========================================
0: ExitFrame [pc: 0x8d8aedbe3d]
1: StubFrame [pc: 0x8d8ae8d40b]
2: ConstructFrame [pc: 0x8d8ae8cfa3]
Security context: 0x3324ecd9e6e9 <JSObject>
3: new NodeImpl(aka NodeImpl) [0x3324c2083e11] [/Users/evhaus/Git/zenhub/client/node_modules/jsdom/lib/jsdom/living/nodes/Node-impl.js:~125] [pc=0x8d8b357fd4](this=0x332437582801 <the_hole>,globalObject=0x3324b10f98e9 <Window map = 0x3324649cf0a1>,args=0x3324b1841471 <JSArray[0]>,...
Too bad 😞 Is that with or without coverage?
That was without coverage. Should have clarified.
Do you think that the fairly old version of Node I'm running (v10) could be a factor in this?
You can try using newer versions; if nothing else, it'd be an interesting data point. Another thing to try is making a heap dump right before it dies. What's on the heap?
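One hedged way to get such a heap dump is Node's v8.writeHeapSnapshot() (Node 11.13+, so it would also mean moving off Node 10). Dropping something like this into a file loaded via setupFilesAfterEnv is only an illustration, not a Jest feature:

// heap-dump-setup.js (hypothetical file added via setupFilesAfterEnv)
const v8 = require('v8');

afterAll(() => {
  // Writes a .heapsnapshot file into the cwd; open it in Chrome DevTools' Memory tab.
  const file = v8.writeHeapSnapshot();
  console.log(`heap snapshot written to ${file}`);
});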
It's interesting that nobody has mentioned that micromatch@4 no longer caches regexps (see https://github.com/micromatch/micromatch/pull/151); https://github.com/facebook/jest/pull/10131 introduces the missing caching on Jest's side. For me, downgrading to micromatch@3 and upgrading to jest-environment-jsdom-sixteen saved 50% of the time. But with Jest 26 and the built-in jsdom I'm still getting a leak error when running jest with --detectLeaks in my case. I tried a fresh repo and all works well.
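For context, the caching that micromatch@4 dropped (and that facebook/jest#10131 re-adds on Jest's side) boils down to memoizing the compiled matcher per glob pattern. A rough sketch of the idea, not the actual PR code:

const micromatch = require('micromatch');

// Compile each glob to a matcher function only once and reuse it.
const matcherCache = new Map();

function cachedMatcher(glob) {
  let matcher = matcherCache.get(glob);
  if (!matcher) {
    matcher = micromatch.matcher(glob);
    matcherCache.set(glob, matcher);
  }
  return matcher;
}

// e.g. filtering thousands of paths against the same testMatch glob
const isTestFile = cachedMatcher('**/__tests__/**/*.js');
console.log(isTestFile('src/__tests__/unit/foo.js')); // true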
https://github.com/facebook/jest/pull/10131 is merged!! <3
Released in 26.1.0, very interested in hearing if it helps folks
@SimenB thanks a lot for the release! Unfortunately, on our side we're still facing a huge difference:
current:
os: osx
node: 12.6.1
jest: 24.9
-----------------
174 test suites
823 tests
322 snapshots
-----------------
done in 23.569s
on every version higher than 24.9
os: osx
node: 12.6.1
jest: 26.1.0
-----------------
174 test suites
823 tests
322 snapshots
-----------------
done in 133.763s
Both with fresh caches and after a complete reinstall of all node modules, untouched configs. Running tests in watch mode, it takes more than 3 minutes on my machine to determine which tests to run. I can't really isolate the issue for a reproduction, but if you can give me some advice on what to test, I'd be very interested. Thanks for all the work you put into this!
@SimenB For the prettier project, it is still slower than v24 when collecting coverage.
I can confirm that version 25 and 26 performance is worse than 24 on Bitbucket Pipelines. I've also noticed that the slowness increases with coverage enabled. Version 25 gets even worse compared to 26, and the pipeline crashes due to memory consumption.
@SimenB I did a git bisect to find out where the performance regression between 24.9 and 25.1 was introduced. I used the prettier tests because they run without modification all the way from 24.9 to 26.1. I compared the accumulated runtime of three runs of the js subset (to save some time) with the cache disabled. More specifically, the command I used was yarn run jest --no-cache tests/js/ with node 10.19, because node 10 was the recommended version for 24.9.
Results:
24.9.0-dev 3cdbd556948b4974b2cc23178977eb159d343df8 151.84s <- Good
25.1.0-dev 5dcc48075f22d581864f381f20bc8b257d2a73cd 223.29s <- Bad
24.9.0-dev bf6109591365a2b71c7b642fa33ed06d3db6cb26 122.58s
24.9.0-dev 77c3ceedfd61ddc841e11fec7b76e540924d3e60 120.42s
24.9.0-dev 30e08e9ae7d7d78f40df757c2ec4b49357285eda 221.28s
24.9.0-dev ad5377333daf6716af3465bba39f86b7db485e2b 222.33s
24.9.0-dev 8ddadfca76eb3fb979df81033d3d0ff564c415d6 120.32s
24.9.0-dev 966f22f2a08d9ac9300b6964ab91c4e75dba4761 120.46s
24.9.0-dev b9084101189639524863e15ef7557ea6bc6704b9 119.87s
24.9.0-dev 1d8245d77d47b4198d51e55da87893d7dfe1a759 129.93s
ad5377333daf6716af3465bba39f86b7db485e2b is the first bad commit
commit ad5377333daf6716af3465bba39f86b7db485e2b
Author: Simen Bekkhus <sbekkhus91@gmail.com>
Date: Mon Dec 2 23:20:16 2019 +0100
feat: add support for `compileFunction` allowing us to avoid the module wrapper (#9252)
Since there is a fallback if compileFunction is not defined, I removed the compileFunction branch from ad5377333daf6716af3465bba39f86b7db485e2b, which restored the performance. Looking at 26.1, the code has moved around a bit, but compileFunction and the fallback are still there. So:
26.1.0-dev 817d8b6aca845dd4fcfd7f8316293e69f3a116c5 242.99s <- with compileFunction
26.1.0-dev 817d8b6aca845dd4fcfd7f8316293e69f3a116c5 151.61s <- without compileFunction
i.e. removing the compileFunction branch (patch) brings 26.1 back to the runtime of 24.9. I'm sure that is not the solution, but at least we have something to work with.
As another data point, our jest suite currently takes around 2767 seconds with jest@24.8.0, and in our update MR it takes around 3497 seconds, an increase of around 27%.
Thanks for all the great work, Jest team; I hope the detective skills of @wurstbonbon can help you fix that regression!
@wurstbonbon thanks for taking the time to dig! Very interesting that compileFunction is slower... That should mean that you can just use a custom test environment instead of applying a patch.
// Custom environment that skips the compileFunction code path
const NodeEnv = require('jest-environment-node');
class MyNodeEnv extends NodeEnv {}
// With compileFunction removed, jest-runtime falls back to its vm.Script wrapper
delete MyNodeEnv.prototype.compileFunction;
module.exports = MyNodeEnv;
(and the same for jsdom env). Can you confirm?
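If anyone wants to try it, wiring the environment above in is just a matter of pointing testEnvironment at the file (the file name here is only an example):

// jest.config.js
module.exports = {
  testEnvironment: '<rootDir>/my-node-env.js',
};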
It being such a bottleneck sounds weird though - Node itself switched to using it 18 months ago: https://github.com/nodejs/node/pull/21573. So it's probably something weird we're doing on our side. The linked https://github.com/nodejs/node/issues/26229 is very interesting though - maybe we need to do some more caching on our side?
@SimenB I just tried something similar to that custom env, and it looks like it was a little better (but still slower than Jest 24). I had to do MyNodeEnv.prototype.getVmContext = null; though, because I'm testing with Jest 26 and it now checks for if (typeof this._environment.getVmContext === 'function') {. Not sure if this could cause other issues, though.
These are the results I'm seeing after a few runs:
Jest 26 w/ testEnvironment: "node" => ~280s
Jest 26 w/ custom test environment => ~210s
Jest 24 => ~160s
Let me know if I can help with any other information or anything else!
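Putting the two tweaks together, a sketch of what that Jest 26 variant might look like (the file name is arbitrary, and the delete may already be a no-op on 26, as described above):

// my-node-env.js - push Jest 26 back onto the vm.Script code path
const NodeEnv = require('jest-environment-node');

class MyNodeEnv extends NodeEnv {}

// jest-runtime only takes the newer compileFunction path when these look usable,
// so removing/nulling them makes it fall back to the older Script-based wrapper.
delete MyNodeEnv.prototype.compileFunction;
MyNodeEnv.prototype.getVmContext = null;

module.exports = MyNodeEnv;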
As expected, the custom env results in the same speedup for prettier. I've also tried it on our codebase, where the difference is ~270s vs ~200s, so only about a 25% reduction rather than 40%. Unfortunately, I can't run our tests with Jest 24 because we rely on the new modern timer mocking.
I missed a delete in my example above, sorry about that.
I wonder if it's enough to just cache the compiled functions manually - could you try to apply this patch? (Both the transpiled JS and the TS source are included.)
EDIT: nope, this breaks horribly 🙈 I've asked in the Node issue if it's possible to populate the compilation cache 🤞
I thought this might do the trick.
// Assumes a module-level cache, e.g. `const compileFunctionCache = new Map();`
const params = this.constructInjectedModuleParameters();
const cacheKey = transformedCode + params;
const cachedData = compileFunctionCache.get(cacheKey);
try {
  compiledFunction = (0, _vm().compileFunction)(transformedCode, params, {
    filename,
    parsingContext: vmContext,
    cachedData,
    produceCachedData: !cachedData,
  });
  if (compiledFunction.cachedDataProduced) {
    compileFunctionCache.set(cacheKey, compiledFunction.cachedData);
  }
} catch (e) {
  throw (0, _transform().handlePotentialSyntaxError)(e);
}
It improves performance a little, but Script is still a lot faster.
Tried the recommendation from @SimenB: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/33252/diffs?commit_id=6d633c88caf70f712fa0ccaac42d952976161ec6
While it did improve performance a bit, it is still considerably slower than on jest 24.x.
@leipert Have you by any chance tried downgrading the jsdom environment to 14?
yarn add jest-environment-jsdom-fourteen --dev
+ "testEnvironment": "jest-environment-jsdom-fourteen"
in your jest config. This still seems to be responsible for the bulk of the duration increase for us (adds 40-50%), but it's starting to look like there are multiple regressions in play.
@pleunv With jest 24.x we are on jsdom 16 via jest-environment-jsdom-sixteen; we had to upgrade due to some issues with testing web components. So the only change we make is jest 24.x + jest-environment-jsdom-sixteen -> jest 26.x + jest-environment-jsdom, so the jsdom version doesn't even change.
Opened up https://github.com/nodejs/node/issues/35375 upstream about the issue found by @wurstbonbon
@SimenB are you aware of any viable alternatives to micromatch? That repo has been silent for over half a year now, and major issues affecting Jest like https://github.com/micromatch/micromatch/issues/179 are still open.
Not really, it's what most libraries use. Could look at e.g. minimatch, but I doubt it'd be viable
@SimenB What makes micromatch better than all the alternatives?
Based on the feedback in the issue I've opened, I'm thinking we should revert to using Script for now, as it seems it's gonna need a bit of work in Node to fix it there.
@leipert @wurstbonbon or anyone else, can you try this patch in your node_modules/jest-runtime/build/index.js?
I'll need to tweak how v8 code coverage works, but I'll try to open a PR tomorrow or next week.
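To make the two code paths concrete, here is a small standalone sketch (not Jest's actual patch) of wrapping code via vm.compileFunction versus via vm.Script plus a manual function wrapper:

const vm = require('vm');

const context = vm.createContext({ console });
const code = 'console.log(msg);';
const params = ['msg'];

// The compileFunction path used by jest 25/26, as discussed above
const viaCompileFunction = vm.compileFunction(code, params, {
  filename: 'example.js',
  parsingContext: context,
});
viaCompileFunction('hello from compileFunction');

// The Script-based wrapper that the patch goes back to
const wrapped = `(function (${params.join(', ')}) { ${code} })`;
const viaScript = new vm.Script(wrapped, { filename: 'example.js' }).runInContext(context);
viaScript('hello from vm.Script');

The linked Node issue appears to point at vm.Script benefiting from V8's compilation cache in a way compileFunction currently does not, which would explain the gap, though that is only a hypothesis.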
I tested out the patch to use Script on our test suites, and here's the result I came to.
Time is in min:sec
Name | Suite 1 | Suite 2 | Suite 3 | Suite 4 |
---|---|---|---|---|
jest 24 | 3:25 | 3:30 | 3:29 | 0:53 |
jest 26 patched | 3:32 | 4:36 | 3:48 | 0:53 |
jest 26 unpatched | 5:10 | 6:12 | 5:11 | 1:07 |
26 patched vs 24 | 4% | 31% | 9% | 1% |
26 unpatched vs 24 | 52% | 76% | 49% | 27% |
26 patched vs unpatched | 46% | 35% | 36% | 25% |
All tests ran on the same commit and a similar test environment (Azure DevOps Hosted Ubuntu 18). I only measured the time taken for running jest on my test suites. Most of my suites are similar in nature (all backend unit tests).
From what I can tell, the patch to use Script does make a huge difference to the perf.
I can't tell if the slowdown on Suite 2 is an outlier or a real regression (I only did 4 runs).
It does look like there's still a perf regression, but not as bad.
Still interesting that v26 doesn't improve on v24...
Thanks @Cellule! That's good enough for me - I'll put together a PR when I have some time
Awesome stuff! That leaves only the Micromatch issue then, hopefully that repo will come under active maintenance again.
BTW, there is also a perf regression in JSDOM itself, I assume. I ran these tests on a big web project, with none of the patches mentioned above applied, and it looked like this:
Jest 24 (testEnvironment: "jsdom") (no rewires, latest CRA): 144.014s
Jest 24 (testEnvironment: "jest-environment-jsdom-sixteen") (rewired latest CRA that changes testEnvironment): 409.473s (also a few failed tests)
Jest 26 (testEnvironment: "jsdom") (no rewires, latest CRA), so the old jsdom, whatever the default for Jest 26 is (I used react-app-rewired to rewire the jest config and a pnpmfile.js to override which version of Jest was installed with react-scripts, since it still ships Jest 24 - like resolutions in yarn): 310.275s
Jest 26 (testEnvironment: "jest-environment-jsdom-sixteen") (rewired latest CRA that changes testEnvironment + pnpmfile.js): over 1200s (some tests failed, plus the test run just got stuck forever)
Surely this is a super vague and unstable performance report, I must say, but I think every input helps :)
https://github.com/facebook/jest/releases/tag/v26.5.0 has the vm.Script change discussed here.
💥 Regression Report
We upgraded jest from 24 to 25 and saw our unit tests go from taking 5 min 23 sec in Jenkins to now over 11 minutes. Only a few snapshot tests broke in the upgrade (we -u'd them), but this is a severe regression IMO; please help me understand how we can fix this. We clear the cache in CI to ensure we always run the latest. Run time went from 5:23 to 11:00.
Last working version
Worked up to version: 24.8.0
Stopped working in version: 25.1.0
To Reproduce
Sorry, I can't share our codebase.