glin / reactable

Interactive data tables for R
https://glin.github.io/reactable

Error: heap limit reached #287

Closed. werkstattcodes closed this issue 1 year ago.

werkstattcodes commented 1 year ago

I am working on a Quarto document (VS Code) which includes a reactable table with relatively verbose text content (around 80,000 rows, each containing a paragraph of text; the file underlying the table is around 100 MB as a CSV). I am able to render the table when feeding in only a sample of rows.

However, when rendering the full file/all rows, the process aborts at some point with the error message:

```
  |......................................................................| 100%
  ordinary text without R code

output file: scr_analysis.knit.md

There were 12 warnings (use warnings() to see them)

<--- Last few GCs --->

[19168:0000023F0794AD00]   222310 ms: Mark-sweep 1382.8 (1426.1) -> 1382.6 (1425.9) MB, 7.7 / 0.0 ms  (average mu = 0.999, current mu = 0.938) allocation failure; scavenge might not succeed
[19168:0000023F0794AD00]   222328 ms: Mark-sweep (reduce) 1398.2 (1441.4) -> 1398.2 (1409.9) MB, 10.7 / 0.0 ms  (+ 0.0 ms in 0 steps since start of marking, biggest step 0.0 ms, walltime since start of marking 11 ms) (average mu = 0.998, current mu = 0.39

<--- JS stacktrace --->

#
# Fatal javascript OOM in Reached heap limit
#
```

I assume it's related to the V8 library, and probably to this issue.

Any idea how to overcome this issue?

Many thanks!

werkstattcodes commented 1 year ago

I am closing this, since it is most likely related to Quarto.

glin commented 1 year ago

reactable does use V8, but only for the new static rendering feature in the development version. Is your table using reactable(static = TRUE) by any chance? If so, you can certainly run into the V8 heap limit (4 GB by default I think) through that. If not, then it probably is Quarto.
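If static rendering does turn out to be the culprit, a minimal sketch of the workaround (assuming the development version's `static` argument described above, with a made-up data frame standing in for the real CSV) would be:

```r
library(reactable)

# Hypothetical data frame standing in for the ~80,000-row CSV
big_data <- data.frame(
  id = seq_len(80000),
  text = rep("a paragraph of text ...", 80000),
  stringsAsFactors = FALSE
)

# static = TRUE pre-renders the table HTML in R via V8, so a large
# input can exhaust the V8 heap during rendering. static = FALSE
# (the default) leaves rendering to the browser instead.
tbl <- reactable(big_data, static = FALSE)
```

This only sidesteps the V8 heap; the browser still has to handle the full 80,000-row payload at view time.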

If you have any custom render functions in R, you might be able to cut the table's memory use by switching to JavaScript render functions. But I'm also surprised that you would hit a memory limit with only a 100 MB CSV file, hmm.
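As a sketch of that suggestion (the column and data here are illustrative, not from the original table): an R render function is called once per cell while the document renders, and its output for every row is embedded in the result, whereas a JavaScript render function is serialized once and evaluated per cell in the browser.

```r
library(reactable)

data <- data.frame(
  text = c("first paragraph ...", "second paragraph ..."),
  stringsAsFactors = FALSE
)

# R render function (runs in R for every cell at render time):
#   text = colDef(cell = function(value) substr(value, 1, 100))

# JavaScript render function: shipped once as a string and run
# client-side for each cell, keeping the R-side output small
tbl <- reactable(data, columns = list(
  text = colDef(cell = JS("function(cellInfo) {
    return cellInfo.value.slice(0, 100)
  }"))
))
```

The trade-off is that the JS function can't use R helpers, so any formatting logic has to be expressible in JavaScript.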