SheetJS / sheetjs

📗 SheetJS Spreadsheet Data Toolkit -- New home https://git.sheetjs.com/SheetJS/sheetjs
https://sheetjs.com/
Apache License 2.0

parse_dom_table function performance #1626

Open ThomasChan opened 5 years ago

ThomasChan commented 5 years ago

https://github.com/SheetJS/js-xlsx/blob/master/xlsx.js#L19061

The merges for loop becomes extremely large and slow when handling a large table. I tested this with a 204 cols * 250 rows table: without the optimization, ws['!merges'] ends up as a huge array, and almost every item merges a single cell with itself, which is useless.
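
For context (this snippet is not from the original post), SheetJS records merges on ws['!merges'] as range objects, so a degenerate entry of the kind described above looks like this:

  // Shape of a SheetJS merge range (0-indexed rows and columns).
  // A real merge spans more than one cell:
  var realMerge = { s: { r: 0, c: 0 }, e: { r: 1, c: 2 } }; // A1:C2
  // A degenerate entry merges a single cell with itself and adds nothing:
  var singleCellMerge = { s: { r: 0, c: 5 }, e: { r: 0, c: 5 } }; // F1:F1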

Before the optimization, in my test case, the export function took 18.6s; after, it took only 4.67s.

And in my customer's environment, exporting a 193 cols * 1277 rows table took 6 minutes; after the optimization, it took only 15s.

(Screenshots: table, before, after.)

Code change: https://github.com/henglabs/js-xlsx/commit/febac23e8e0c1cc12efe81ef7a7d1d369b55f352

ThomasChan commented 5 years ago

Well, my code change above was wrong. I then worked out the logic of the merges for loop and made another change, decreasing the time complexity from O(merges.length * n) to O(merges.length).


When exporting a 200 cols * 10000 rows table, the parse_dom_table function no longer runs out of memory, but JSZip's utf8ToBytes function does.

SheetJSDev commented 5 years ago

@ThomasChan thanks for looking into this, and feel free to submit a PR.

There's definitely room for improvement. The weird loop is done that way to address a case like:

A1:C2 D1:E2 F1
F2

The first cell in the second row should be located at F2, but to determine that you need to look at the A1:C2 merge first, then the D1:E2 merge.

The implementation was designed expecting only a small number of merges. If you have many, the approach is extremely slow.

Given the parse order, it will always be sorted by starting row then by starting column. To reduce it to a single walk through the merge array, you might be able to sort by ending row then by starting column (sorting the array with a custom sort function). Then you'd keep track of a starting index into the array (elements before that point could never affect the result, so you can skip them).
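
To make that concrete, here is a rough sketch of such a walk (not the shipped implementation; the helper name next_free_col and the bookkeeping details are made up), assuming merges holds SheetJS ranges {s:{r,c}, e:{r,c}} kept sorted by ending row and then by starting column:

  var start = 0; // merges before this index ended on an earlier row

  // Returns the first column at or after C in row R not covered by a merge.
  function next_free_col(merges, R, C) {
    // ranges that ended above row R can never cover this or any later cell
    while (start < merges.length && merges[start].e.r < R) ++start;
    // scan only the still-active ranges for one covering (R, C)
    for (var i = start; i < merges.length; ++i) {
      var m = merges[i];
      if (m.s.r <= R && R <= m.e.r && m.s.c <= C && C <= m.e.c) {
        C = m.e.c + 1; // jump past the merged block
        i = start - 1; // re-check: another range may now cover the new column
      }
    }
    return C;
  }

With the layout above, placing the first cell of the second row starts at column 0, gets pushed past A1:C2 to column 3 (D), then past D1:E2 to column 5, which is F2.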

So that we have a sense for the performance, can you share a sample table that you are trying to convert?

ThomasChan commented 5 years ago

Thanks for the reply. I have submitted a PR. The sample table is nothing special compared to any HTML table: you can create a BIG table like I said above, 200 cols * 10000 rows, then use xlsx to export it like:

  var t = document.querySelector('table'); // the <table> element to export
  var title = 'Big Table';
  var writeOptions = {
    Props: {
      Title: title,
      CreatedDate: new Date().toLocaleString(),
    },
    type: 'binary', // without binary it will run out of memory
  };
  var wb = XLSX.utils.book_new();
  var ws = XLSX.utils.table_to_sheet(t, {
    dense: true, // without dense it also runs out of memory
    raw: true, // without raw, performance slows down
  });
  XLSX.utils.book_append_sheet(wb, ws, title);
  XLSX.writeFile(
    wb,
    `${title}.xlsx`,
    writeOptions,
  );
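
For reference, here is one hypothetical way to build such a table in the browser before running the snippet above (none of this is from the original report; shrink ROWS and COLS if the DOM becomes too heavy):

  // Build a ROWS x COLS <table> and attach it to the page so the
  // querySelector in the export snippet above can find it.
  var ROWS = 10000, COLS = 200;
  var bigTable = document.createElement('table');
  for (var r = 0; r < ROWS; ++r) {
    var tr = document.createElement('tr');
    for (var c = 0; c < COLS; ++c) {
      var td = document.createElement('td');
      td.textContent = 'R' + r + 'C' + c;
      tr.appendChild(td);
    }
    bigTable.appendChild(tr);
  }
  document.body.appendChild(bigTable);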