testing-library / dom-testing-library

🐙 Simple and complete DOM testing utilities that encourage good testing practices.
https://testing-library.com/dom
MIT License

performance improvement #1183

Closed rluvaton closed 1 year ago

rluvaton commented 1 year ago

What: just a performance improvement

Why: spreading the big array into a new array can be costly compared to concat
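
For context, the change is roughly of this shape (a minimal sketch with illustrative names, not the actual diff in src/queries/text.ts):

```javascript
// Illustrative names only; they do not come from the library source.
const small = ['extra match'];
const big = Array.from({ length: 1000 }, (_, i) => `node-${i}`);

// Current approach: spread both arrays into a fresh array literal,
// which walks the big array element by element.
const spreadResult = [...small, ...big];

// Proposed approach: concat the big array onto the small one;
// in many engines this can take a cheaper path than spreading.
const concatResult = small.concat(big);

console.log(spreadResult.length === concatResult.length); // true
```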

Checklist:

codesandbox-ci[bot] commented 1 year ago

This pull request is automatically built and testable in CodeSandbox.

To see build info of the built libraries, click here or the icon next to each commit SHA.

Latest deployment of this branch, based on commit 0c323b57a37e46d7b56dc8054770b41236146308:

| Sandbox | Source |
| --- | --- |
| react-testing-library-examples | Configuration |
codecov[bot] commented 1 year ago

Codecov Report

Merging #1183 (0c323b5) into main (edffb7c) will not change coverage. The diff coverage is 100.00%.

```
@@            Coverage Diff            @@
##              main     #1183   +/-   ##
=========================================
  Coverage   100.00%   100.00%           
=========================================
  Files           24        24           
  Lines          998      1003    +5     
  Branches       326       330    +4     
=========================================
+ Hits           998      1003    +5     
```
| Flag | Coverage Δ |
| --- | --- |
| node-12 | 100.00% <100.00%> (ø) |
| node-14 | 100.00% <100.00%> (ø) |
| node-16 | 100.00% <100.00%> (ø) |

Flags with carried forward coverage won't be shown. Click here to find out more.

| Impacted Files | Coverage Δ |
| --- | --- |
| src/queries/text.ts | 100.00% <100.00%> (ø) |


rluvaton commented 1 year ago

I did not measure this change specifically, but I did benchmark the array-merging options:

The code:

```javascript
const b = require('benny');
const { faker } = require('@faker-js/faker');

// 25-200 empty objects, standing in for matched DOM nodes.
function generateArr() {
  return Array.from(
    { length: faker.datatype.number({ min: 25, max: 200 }) },
    () => ({}),
  );
}

const size = 50;

// Pre-generate the inputs so the benchmark measures merging, not data generation.
// Each entry is [big, small, bigAfterFilter]:
//   big            - the large array
//   small          - an empty array or a one-element array
//   bigAfterFilter - a random subset of big, simulating the result after filtering
const allOptions = Array.from({ length: size }, () => {
  const arr = [
    generateArr(),
    faker.datatype.boolean() ? [] : [faker.datatype.number()],
  ];
  arr.push(
    faker.helpers.arrayElements(
      arr[0],
      faker.datatype.number({ min: 1, max: 10 }),
    ),
  );
  return arr;
});

let index = 0;

b.suite(
  'Merge Arrays',

  b.add('current - spread both', function () {
    const [big, small] = allOptions[index % size];
    const data = [...small, ...big];
    index++;
  }),

  b.add('concat', function () {
    const [big, small] = allOptions[index % size];
    const data = small.concat(big);
    index++;
  }),

  b.add('concat smaller array', function () {
    const [, small, bigAfterFilter] = allOptions[index % size];
    const data = small.concat(bigAfterFilter);
    index++;
  }),

  b.cycle(),
  b.complete(),
);
```

The results:

```
  current - spread both:
    2 505 686 ops/s, ±0.56%    | slowest, 77.29% slower

  concat:
    8 202 593 ops/s, ±0.20%    | 25.67% slower

  concat smaller array:
    11 034 817 ops/s, ±0.09%   | fastest

Finished 3 cases!
  Fastest: concat smaller array
  Slowest: current - spread both
```

The "concat smaller array" case is supposed to represent the elements array after filtering.

If you have any questions about the benchmark code or why I did something a certain way, I'm here 😄

eps1lon commented 1 year ago

Benchmarks of the library itself are relevant here, not synthetic benchmarks of similar code.

rluvaton commented 1 year ago

The reason I did not write a library benchmark is that I wanted to avoid any possible caching, get more realistic results, and test different queries...

I'll try to create one now...
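
For reference, a library-level benchmark could look roughly like this (a sketch only, assuming jsdom and benny; the document size and queried text are arbitrary, and this is not the benchmark that was actually run):

```javascript
const b = require('benny');
const { JSDOM } = require('jsdom');

// Create a DOM and expose it globally so the queries can run outside a test runner.
const dom = new JSDOM('<!doctype html><html><body></body></html>');
global.window = dom.window;
global.document = dom.window.document;

const { getByText } = require('@testing-library/dom');

// Fill the document with many text nodes so the query has real work to do.
for (let i = 0; i < 500; i++) {
  const div = document.createElement('div');
  div.textContent = `item ${i}`;
  document.body.appendChild(div);
}

b.suite(
  'getByText on a large document',
  b.add('getByText', () => {
    getByText(document.body, 'item 250');
  }),
  b.cycle(),
  b.complete(),
);
```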

rluvaton commented 1 year ago

After running the application benchmark I did not see any difference, which is weird, but I'm closing anyway 🤷🏻‍♂️