Closed pl0xz0rz closed 1 year ago
hello @pl0xz0rz, I propose to create a realistic case and measure the CPU time and data volume for our two realistic use cases.
Realistic benchmark - drawRDFGasGain.ipynb
arrayCompression = arrayCompressionRelative8
dfSample = df[0:5000000].sample(frac=0.2).sort_index()
fig = bokehDrawSA.fromArray(dfSample, "qVector>0", figureArray, widgetParams, layout=figureLayoutDesc,
                            sizing_mode='scale_width', nPointRender=50000, widgetLayout=widgetLayoutDesc,
                            parameterArray=parameterArray, histogramArray=histoArray, rescaleColorMapper=True,
                            arrayCompression=arrayCompressionRelative16, aliasArray=aliasArray, palette=kBird256)
CPU times: user 12.1 s, sys: 104 ms, total: 12.2 s
Wall time: 12.2 s
CPU times: user 8.67 s, sys: 40 ms, total: 8.71 s
Wall time: 8.69 s
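The down-sampling step in the snippet above can be illustrated in isolation (a minimal pandas sketch; the `df` here is synthetic stand-in data, not the GasGain dataframe):

```python
import numpy as np
import pandas as pd

# synthetic stand-in for the real dataframe
df = pd.DataFrame({"qVector": np.random.default_rng(1).normal(1.0, 0.5, 100_000)})

# take the first N rows, keep a 20% random sample, restore the original row order
dfSample = df[0:50_000].sample(frac=0.2, random_state=0).sort_index()
assert len(dfSample) == 10_000
assert dfSample.index.is_monotonic_increasing
```

Sorting by index after `sample` keeps the rows in their original order, which matters when the index encodes time or event sequence.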
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
------------------------------------------------------------------------------------------------ JSON report ------------------------------------------------------------------------------------------------
report saved to: test6.json
================================================================================= 37 passed, 7 warnings in 62.48s (0:01:02) =================================================================================
real 1m3.957s
user 13m7.152s
sys 0m47.144s
Test after the commit fix: timing is as before with the optimized version (https://github.com/miranov25/RootInteractive/pull/292/commits/636fee0ab72a3e8b00b87345038c677e233c24b2)
CPU times: user 8.94 s, sys: 72 ms, total: 9.01 s
Wall time: 9 s
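For reference, the CPU/Wall timings quoted above can be reproduced outside Jupyter with a small helper (a sketch equivalent to the notebook's `%%time` magic; the commented usage assumes the RootInteractive objects from the snippet above):

```python
import time

def timed(fn, *args, **kwargs):
    """Report CPU and wall time for one call, similar to Jupyter's %%time."""
    cpu0, wall0 = time.process_time(), time.perf_counter()
    result = fn(*args, **kwargs)
    cpu1, wall1 = time.process_time(), time.perf_counter()
    print(f"CPU time: {cpu1 - cpu0:.2f} s, Wall time: {wall1 - wall0:.2f} s")
    return result

# usage sketch (objects assumed from the benchmark above):
# fig = timed(bokehDrawSA.fromArray, dfSample, "qVector>0", ...)
```

`time.process_time` counts CPU time of the current process only, so a large gap between it and `time.perf_counter` indicates waiting on I/O or other processes.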
This PR improves the efficiency of compression.

Observed results in test_Compression.py (1 million points):
- Relative rounding: server-side CPU time reduced by 50%, file size reduced by 10%, client-side CPU time for decompression unchanged.
- Absolute rounding: server-side CPU time reduced by 70%, file size reduced by 20%; client-side CPU time for decompression was not measured.
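The rounding behind these numbers can be sketched as follows. This is a minimal illustration of relative vs. absolute rounding, not the RootInteractive implementation; the function names are hypothetical, and `arrayCompressionRelative8/16` is assumed to apply this kind of mantissa truncation before entropy coding:

```python
import zlib
import numpy as np

def round_relative(x, n_bits):
    """Sketch of relative rounding: keep roughly n_bits of mantissa precision.

    Zeroing the low mantissa bits makes the byte stream far more repetitive,
    so a downstream codec such as zlib compresses it much better.
    """
    x = np.asarray(x, dtype=np.float32)
    mantissa, exponent = np.frexp(x)          # x = mantissa * 2**exponent
    scale = 2.0 ** n_bits
    mantissa = np.round(mantissa * scale) / scale
    return np.ldexp(mantissa, exponent).astype(np.float32)

def round_absolute(x, delta):
    """Sketch of absolute rounding: quantize to a fixed step delta."""
    x = np.asarray(x, dtype=np.float32)
    return (np.round(x / delta) * delta).astype(np.float32)

rng = np.random.default_rng(0)
data = rng.normal(size=100_000).astype(np.float32)

rel = round_relative(data, 8)                 # relative error bounded by ~2**-8
raw_size = len(zlib.compress(data.tobytes()))
rel_size = len(zlib.compress(rel.tobytes()))
# the rounded array compresses to a noticeably smaller byte stream
assert rel_size < raw_size
```

The trade-off measured in the PR follows from this: rounding is a cheap elementwise pass on the server, while decompression on the client is unaffected because the rounded values are ordinary floats.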