seankross / mario


put a hard upper cap on the size of data frames to be encoded to JSON, and bail with an error if over that cap #41

Closed pgbovine closed 2 years ago

pgbovine commented 2 years ago

We should put a hard upper limit on how big a data frame mario is willing to encode to JSON, and issue an error if the data is too big. JSON encoding takes forever with large data frames and can also run out of RAM. With a cap in place, large data frames like diamonds would fail right away instead of churning on the server for a while (and risking crashing the server altogether), only to time out and fail after 15-20 seconds:

library(dplyr)
library(ggplot2)
diamonds %>% arrange(carat)

object_size might do the trick, though there may be issues with aliasing, etc. Still, it could be a good first pass: http://adv-r.had.co.nz/memory.html#object-size
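A minimal sketch of what that check might look like, assuming a hypothetical cap constant and helper name; it uses base R's object.size() for a quick estimate (pryr::object_size(), from the link above, handles shared/aliased memory more accurately):

MAX_ENCODE_BYTES <- 1e6  # hypothetical cap (1 MB); the real threshold would need tuning

check_encodable <- function(df, max_bytes = MAX_ENCODE_BYTES) {
  # rough size estimate; may over-count aliased columns
  size <- as.numeric(utils::object.size(df))
  if (size > max_bytes) {
    stop(sprintf(
      "data frame is too large to encode to JSON (%s bytes > %s byte cap)",
      format(size, big.mark = ","), format(max_bytes, big.mark = ",")
    ), call. = FALSE)
  }
  invisible(df)
}

# e.g. call check_encodable(diamonds) before handing the data frame to the JSON encoder

Failing here is cheap: object.size() doesn't serialize anything, so the error comes back immediately instead of after a 15-20 second timeout.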

Also, while doing this, we could implement this optimization to save more space too: https://github.com/seankross/mario/issues/28