Closed milesj closed 4 years ago
This is using Jest 24 beta btw.
Have you tried limiting workers?
With 4 workers it's still 190.84s, but it seemed to crash on Node 11. https://travis-ci.org/milesj/boost/jobs/481648087 Gonna try 2 workers.
With `-i` it's 450.86s. Not sure either of those helps me right now.
Unless you're paying, 2 workers is the max. However, `runInBand` should be faster... odd that it makes things horribly worse.
Can you try removing `detectOpenHandles`? It uses `async_hooks`, which has quite an atrocious performance overhead.
Turned off `detectOpenHandles` and using 2 workers. Still pretty slow on Node 10 at 284.33s. I wonder if it's because I'm running Jest in a child process.
I also noticed a ton of `7`s being output, which is super weird. It's definitely not coming from my code, but from the Jest process. Have yet to figure out where it's coming from.
// Received
..............................77777777777777777777777777777777777777..
// Should be
................................
Running Jest outside of a child process is the same speed.
@milesj are you using the jsdom env? I'm thinking this may be the cause
@mattfysh Good call, that slipped my mind. Since I haven't defined it, it's defaulting to jsdom. I'll try it without.
Using node env crashed Travis. Using node env + 2 workers took about 282.66s. Still seems rather slow. This is weird.
Any other diagnostics I could run to try and debug this?
Tests taking 764.268s now. https://travis-ci.org/milesj/boost/jobs/484611089
Possible memory leak or something else?
> I wonder if it's because I'm running Jest in a child process.
Hmm, maybe. Would you be able to reduce it down to something that's slow locally as well?
Make sure to pass `--maxWorkers` and/or `--runInBand` down to the spawned child process.
> Possible memory leak or something else?
You can run with `--detectLeaks` and see if Jest itself thinks something is leaking.
Trying that, will report back. In the meantime, I've been going through my CI builds, and this commit seemed to double the time it takes.
https://github.com/milesj/boost/commit/cd1c50019541f04e3acb7f55b3a9c5d21c53bcf3
But nothing seems out of the ordinary. This is roughly around when I started using the Jest 24 alpha, but I'm not sure if that's a red herring or not.
Also this one, which changes the istanbul comments. But that also seems like a stretch.
https://github.com/milesj/boost/commit/fa5690c84e0fcc5e0865b89e9dfc421efe11f319
Memory leak stuff seems to be spurious. Not entirely useful. https://travis-ci.org/milesj/boost/jobs/484960305
It's failing on the most basic of tests: https://github.com/milesj/boost/blob/master/packages/core/tests/i18n/LanguageDetector.test.ts / https://github.com/milesj/boost/blob/master/packages/core/src/i18n/LanguageDetector.ts
Can you reproduce the error/slowness in a smaller repo? There's so much code there, it's hard to tell what's going on.
Potentially, just a branch in the same repo which deletes most of the code but still shows it being slow
Yeah I'll try and narrow it down if I can.
I wonder if this is also relevant: https://github.com/facebook/jest/issues/7732
Then it should be slow locally as well. Might very well be the reason, of course - if you have a bajillion files, the relatively weak CPU of travis will be slow 🙂 Shouldn't be that slow though
I've noticed that imports of `execa` trigger the detect-leaks error. Here's an example:
[20:32:25] Miles:boost > NODE_ENV=test BOOST_ENV=test npx jest -i --detectLeaks --detectOpenHandles ./packages/core/tests/i18n/
PASS packages/core/tests/i18n/FileBackend.test.ts
PASS packages/core/tests/i18n/LanguageDetector.test.ts
Test Suites: 2 passed, 2 total
Tests: 4 passed, 4 total
Snapshots: 0 total
Time: 0.965s, estimated 1s
Ran all test suites matching /.\/packages\/core\/tests\/i18n\//i.
[20:33:02] Miles:boost > NODE_ENV=test BOOST_ENV=test npx jest -i --detectLeaks --detectOpenHandles ./packages/core/tests/i18n/
PASS packages/core/tests/i18n/FileBackend.test.ts
FAIL packages/core/tests/i18n/LanguageDetector.test.ts
● Test suite failed to run
EXPERIMENTAL FEATURE!
Your test suite is leaking memory. Please ensure all references are cleaned.
There is a number of things that can leak memory:
- Async operations that have not finished (e.g. fs.readFile).
- Timers not properly mocked (e.g. setInterval, setTimeout).
- Keeping references to the global scope.
at node_modules/jest-cli/build/TestScheduler.js:240:24
Test Suites: 1 failed, 1 passed, 2 total
Tests: 3 passed, 3 total
Snapshots: 0 total
Time: 0.99s, estimated 1s
Ran all test suites matching /.\/packages\/core\/tests\/i18n\//i.
Going through execa now. Maybe the leak there is causing issues.
Found some timers that weren't faked and it brought it down to 370s. Also fixed the open handle issue.
https://github.com/milesj/boost/pull/32/files
Think this is as good as it'll get for now.
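The class of fix in that PR can be illustrated in plain Node (this is an illustration of the leak pattern, not the actual code from the PR):

```javascript
// An interval with no teardown is an open handle: it keeps the worker
// alive and trips --detectOpenHandles / --detectLeaks.
const timer = setInterval(() => {}, 1000);

// A referenced timer holds the event loop open:
const leaking = timer.hasRef();

// Teardown: unref (or clearInterval, or jest.useFakeTimers in tests)
// releases the handle so the process can exit.
timer.unref();
const released = timer.hasRef();

clearInterval(timer);
```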
I'm using the local node + chrome debugger to profile Jest. Just gonna post some screenshots.
Jasmine promises / queue runner seem to take the most time.
`execa` & `env-ci` take a lot of time with their child processes (I can probably mock these).
An async call in my `Tool`'s constructor may be the big issue. I'll try and lazy-load this.
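A sketch of that lazy-load (class and method names are illustrative, not the repo's actual API): defer the async work out of the constructor and memoize it, so it runs at most once, and only for tests that actually need it.

```javascript
class Tool {
  constructor() {
    // No async work here; tests that never touch config pay nothing.
    this.configPromise = null;
  }

  loadConfig() {
    // Memoize: the async load runs at most once.
    if (!this.configPromise) {
      // Stand-in for real async I/O (reading config files, etc.).
      this.configPromise = Promise.resolve({ debug: false });
    }
    return this.configPromise;
  }
}
```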
Gonna close this.
This issue has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs. Please note this issue tracker is not a help forum. We recommend using StackOverflow or our discord channel for questions.
🐛 Bug Report
When running Jest locally, all tests finish in 12 seconds. When running in Travis CI, they finish in 448 seconds. Here's the local output:
And the CI output.
Notice in the CI that `BoostReporter.test.ts` takes 161.036s, `NyanReporter.test.ts` takes 170.847s, and `Console.test.ts` takes 90.95s, while the local testing does not even hit the slow timing threshold. The heap usage is relatively the same between both.

The one common denominator between the reporters is that they use chalk and have many tests that compare chalk output. For example:
However, I've tried disabling chalk using `chalk.enabled = false`, but the timings did not change.

Another possible culprit is the console mocking that I do here:

But I'm still at a loss as to why that would cause these issues.
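For context, the console mocking in question is roughly this shape (a plain-Node illustration, not the repo's actual helper):

```javascript
// Capture console.log output during a test, then restore the original
// so the patch can't leak into other tests.
const logs = [];
const originalLog = console.log;
console.log = (...args) => logs.push(args.join(' '));

console.log('hello from test');

console.log = originalLog; // always restore in teardown
```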
To Reproduce
Steps to reproduce the behavior:
1. `yarn install`
2. `yarn run setup`
3. `yarn run jest`
Expected behavior
CI tests not to be so slow.
Link to repl or repo (highly encouraged)
Example CI builds can be found here: https://travis-ci.org/milesj/boost/builds/481201444
Run `npx envinfo --preset jest`
Local:
CI: