callstack / reassure

Performance testing companion for React and React Native
https://callstack.github.io/reassure/
MIT License

[FEATURE] Support Running With Vitest Instead of Jest #421

Closed: ckifer closed this issue 11 months ago

ckifer commented 1 year ago

Is your feature request related to a problem? Please describe. I'd rather not need a Jest setup to use Reassure, since I currently use Vitest for testing.

Describe the solution you'd like Support configuring Reassure with Vitest or Jest at the consumer's preference.

Describe alternatives you've considered N/A

Additional context There is already a TEST_RUNNER_PATH env variable, but pointing it at the Vitest binary fails with a complaint about needing WASM:

"test:perf": "TEST_RUNNER_PATH=./node_modules/.bin/vitest reassure",

ReferenceError: WebAssembly is not defined
    at file:///workplace/ckifer/recharts/node_modules/vite/dist/node/chunks/dep-2b82a1ce.js:16497:1125
    at ModuleJob.run (node:internal/modules/esm/module_job:193:25)
    at async Promise.all (index 0)
    at async ESMLoader.import (node:internal/modules/esm/loader:530:24)
    at async main (file:///workplace/ckifer/recharts/node_modules/vitest/dist/cli-wrapper.js:61:5)
mdjastrzebski commented 1 year ago

@ckifer this is a great idea. The first step toward making it work would be to make sure that RNTL / RTL run correctly with Vitest, as Reassure is built on top of these libraries. Are you interested in web React or React Native testing?

Do you have capacity to verify whether RTL / RNTL support vitest? And/or contribute missing config/etc?

ckifer commented 1 year ago

For me mostly web testing.

I have no "proof" at the moment that Vitest is compatible with RTL, but for me all functions and features work out of the box once Vitest itself is set up. I think RTL is the main testing library people are using with Vitest and React right now, so it should be pretty well supported.

ckifer commented 1 year ago

Testing Library calls out in their docs that they recommend using Vitest when building Svelte applications: https://testing-library.com/docs/svelte-testing-library/setup/#vitest

That gives me enough confidence that these are completely compatible.

mdjastrzebski commented 1 year ago

@ckifer would you be able to confirm this empirically by running a sample test with Vitest & RTL? (Svelte probably uses Svelte Testing Library or something.)

Regarding UX, how would you see such an option? Something like "reassure --vitest" / "reassure --runner vitest", or some other way?

ckifer commented 1 year ago

@mdjastrzebski Recharts just moved from Jest to Vitest, and we run all tests with RTL.

https://github.com/recharts/recharts/actions/runs/6605228230/job/17940180193

The second option seems fine: add a flag for the runner, default it to jest, and have vitest as an option.
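A minimal sketch of that flag handling, assuming a hypothetical `resolveRunner` helper inside the CLI (the names are illustrative, not Reassure's actual code):

```typescript
// Hypothetical sketch of a `--runner` flag that defaults to Jest.
// `RunnerName` and `resolveRunner` are illustrative names, not Reassure API.
type RunnerName = 'jest' | 'vitest';

function resolveRunner(argv: string[]): RunnerName {
  const i = argv.indexOf('--runner');
  if (i === -1) return 'jest'; // default: current behavior
  const value = argv[i + 1];
  if (value === 'jest' || value === 'vitest') return value;
  throw new Error(`Unsupported test runner: ${value}`);
}
```

An unknown value fails fast rather than silently falling back to Jest, which keeps misconfigured CI runs from producing misleading measurements.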

mdjastrzebski commented 1 year ago

@ckifer do you have capacity to contribute such feature as a PR?

ckifer commented 1 year ago

@mdjastrzebski no not at the moment 😅

Tagging a few people interested in this though: @nikolasrieble @PavelVanecek @akamfoad @branberry

branberry commented 1 year ago

@mdjastrzebski @ckifer I should have some bandwidth to work on this!

mdjastrzebski commented 1 year ago

@branberry go ahead!

branberry commented 1 year ago

Hey y'all! Quick update: I have been working on this issue, and I still ran into the WebAssembly import error. I narrowed it down to our call to spawnSync. Specifically, the --jitless flag seems to cause this error when using Vitest. When I remove it, everything appears to run just fine, but I am not sure what the ramifications of removing it are.

I can conditionally remove the --jitless flag for Vitest as a workaround.
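A sketch of what that conditional removal could look like (the helper name `buildNodeFlags` and the exact flag list are illustrative, not Reassure's actual implementation):

```typescript
// Illustrative only: build the V8 flags passed to the spawned Node process.
// Under Jest, --jitless keeps runs comparable; under Vitest it breaks WASM,
// so it is dropped here as a workaround (stability implications unclear).
function buildNodeFlags(runner: 'jest' | 'vitest'): string[] {
  const common = ['--expose-gc']; // example of a flag kept for both runners
  return runner === 'jest' ? [...common, '--jitless'] : common;
}
```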

Thoughts @mdjastrzebski?

mdjastrzebski commented 1 year ago

Does Vitest run on the V8 JS engine (Node)?

If so, the goal of the --jitless flag is to make all runs comparable, instead of Just-In-Time compilation kicking in during later runs and causing distortions due to: 1) pausing for JIT compilation, 2) faster execution when using compiled code.

From that perspective, we should keep using --jitless or achieve a similar effect in some other way.
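As a sanity check of that trade-off, plain JavaScript still runs under --jitless; only JIT compilation (and, as it turns out below, WebAssembly) is disabled:

```shell
# Plain JS executes under --jitless (interpreted only, no JIT warm-up effects).
node --jitless -e "console.log(1 + 1)"
```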

ckifer commented 1 year ago

Vitest is definitely running on Node.

At some point the code must get here and try to load WASM.

It would be interesting to debug the Vitest/Vite code, or to file an issue with Vite.

mdjastrzebski commented 1 year ago

Seems like WASM might need some compilation. Based on the V8 docs, it seems we might want to use the Liftoff compiler (non-optimizing) vs TurboFan (optimizing) in order to keep execution time stable (potentially excluding the initial execution, which we already discard with Jest as well): https://v8.dev/docs/wasm-compilation-pipeline#flags-for-experimentation

@branberry could you add the proper flags depending on whether Vitest is used (probably --liftoff --no-wasm-tier-up, but please experiment locally with stability and share your results)?

mdjastrzebski commented 1 year ago

Relevant description of the flags from ChatGPT:

Now, regarding the specific flag --no-wasm-tier-up: In V8, there is a concept called tiering, which is used in the context of WebAssembly and also for JavaScript (in the form of the JIT compiler). For WebAssembly, V8 has a two-tier compilation system:

Baseline compiler (Liftoff): This compiles WebAssembly code quickly with a baseline level of optimization, allowing the code to start running as soon as possible.

Optimizing compiler (TurboFan): This takes more time to compile the code but produces more optimized machine code, which should run faster.

The tier-up process is when V8 initially compiles the WebAssembly code using the baseline compiler and then recompiles hot functions (those that are frequently executed) with the optimizing compiler in the background. Once the optimized code is ready, V8 swaps the baseline-compiled code with the optimized code to achieve better performance.

The --no-wasm-tier-up flag is used to disable this tier-up behavior. When this flag is passed to V8, it will compile WebAssembly code using the baseline compiler and will not attempt to recompile it with the optimizing compiler, regardless of how often functions are called. This means that the initial compilation will be faster, but the executed code may be less optimized and thus slower during execution.
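Combining the flags discussed above, a local experiment could look like this (illustrative; whether Node accepts these V8 flags may depend on the Node/V8 version):

```shell
# Force the Liftoff baseline WASM compiler and disable tier-up to TurboFan.
# With JIT still enabled, the WebAssembly global remains available.
node --liftoff --no-wasm-tier-up -e "console.log(typeof WebAssembly)"
```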

branberry commented 1 year ago

I tried using those experimental flags locally, but I ran into the same error:

[screenshot: the same WebAssembly error output with the experimental flags]

I think the issue lies with the fact that the --jitless flag doesn't support WebAssembly.

Based on the release notes for JIT-less execution, it appears that this is the case:

[screenshot: V8 release notes stating that JIT-less execution does not support WebAssembly]

It looks like WebAssembly cannot be interpreted and is simply not available when running with the --jitless flag. At this point I think we might need to open an issue with Vitest to see if we can run it without WebAssembly, or temporarily use Reassure with Vitest without the --jitless flag. I don't think the latter is a great option because of the issues you mentioned @mdjastrzebski.
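This is easy to confirm: under --jitless, V8 exposes no WebAssembly global at all, which is exactly why the runner fails with "WebAssembly is not defined":

```shell
# Under --jitless the WebAssembly global is absent entirely,
# hence the ReferenceError when es-module-lexer tries to use it.
node --jitless -e "console.log(typeof WebAssembly)"
```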

Although, based on the code you shared @ckifer, it looks like the loadWebAssemblyModule only is used for the ESM imports. I wonder if we could switch to using the CommonJS approach to avoid using WebAssembly.

mdjastrzebski commented 1 year ago

What is the purpose of using WASM in Vitest?

branberry commented 1 year ago

That I don't know, unfortunately. It's a part of their implementation as far as I can tell, but I'll have to do more research to see if there is a way to disable WebAssembly when using Vitest.

EDIT: Upon further investigation, I believe the problem stems from Vite itself, which depends on the es-module-lexer library. When debugging the exact location of the error, the transpiled JS names it as the source of the WASM usage. Looking at the Vite source code, es-module-lexer is used throughout.

To summarize, Vitest depends on Vite, which depends on the es-module-lexer. The es-module-lexer requires WebAssembly to run.

[screenshot: es-module-lexer loading WASM in the transpiled Vite output]

[screenshot: es-module-lexer usage throughout the Vite source]

mdjastrzebski commented 11 months ago

I think there might be an option to swap --jitless for some other flags that disable the optimizing compiler. I'm not sure how Vitest works internally, but --no-opt --no-sparkplug sounds promising: the first should disable the TurboFan optimizing compiler, and the second should disable the Sparkplug non-optimizing compiler.

@branberry do you have the capacity to verify whether it works correctly in your case?

ckifer commented 11 months ago

@mdjastrzebski would we still be able to add a flag for the different test runner?

mdjastrzebski commented 11 months ago

@ckifer pls check if the --enable-wasm option to reassure resolves the issue for you.

BTW, this does not pass different flags to the test runner, but rather to the Node.js process itself, which then executes the test runner.

ckifer commented 11 months ago

Okay, so this works:

TEST_RUNNER_PATH=./node_modules/.bin/vitest TEST_RUNNER_ARGS='--config=\"./vitest.reassure.config.ts\"' reassure --baseline --enable-wasm

vitest.reassure.config.ts is:

import react from '@vitejs/plugin-react';
import { defineConfig } from 'vitest/config';

export default defineConfig({
  plugins: [react()],
  test: {
    environment: 'jsdom',
    globals: true,
    setupFiles: ['test/vitest.setup.ts'],
    exclude: ['react-smooth', 'node_modules', 'dist', '.idea', '.git', '.cache', 'build'],
    include: ['test/**/*.perf-test.tsx'],
    threads: false,
    watch: false,
  },
});

I do, however, get this error:

❌ Measure code is running under incorrect Node.js configuration.
Performance test code should be run in Jest with certain Node.js flags to increase measurements stability.
Make sure you use the Reassure CLI and run it using "reassure" command.

But after running with --baseline and then without it, we get:

✅  Written Current performance measurements to .reassure/current.perf
🔗 /workplace/ckifer/recharts/.reassure/current.perf

❇️  Performance comparison results:
 - Current: fix/responsive_ref (40259555ca6d0d04f9a1ae68c26b2550e228ddeb) - 2023-12-05 18:44:40Z
 - Baseline: fix/responsive_ref (40259555ca6d0d04f9a1ae68c26b2550e228ddeb) - 2023-12-05 18:41:57Z

➡️  Significant changes to duration

➡️  Meaningless changes to duration
 - test/chart/AreaChart.perf-test.tsx > AreaChart > Simple test [render]: 20.8 ms → 23.0 ms (+2.2 ms, +10.7%)  | 2 → 2 

➡️  Count changes

➡️  Added scenarios

➡️  Removed scenarios

✅  Written JSON output file .reassure/output.json
🔗 /workplace/ckifer/recharts/.reassure/output.json

✅  Written output markdown output file .reassure/output.md
🔗 /workplace/ckifer/recharts/.reassure/output.md

🚀

@branberry FYI if you want to add this back to recharts

@mdjastrzebski maybe, if we're not adding an option for a different test runner, we could at least add these steps to the docs.

branberry commented 11 months ago

@ckifer, thanks for the update! I should have some time this week to work on adding it back in.