speedups and fixes for large data

One idea is to chunk the data into equal-sized chunks, e.g., a 1000-row data.frame into 10 chunks of 100 rows each; we do this in elastic. Unfortunately, CouchDB doesn't accept ND-JSON, though.
Right now, trying to write the flights dataset takes a long time and then fails:
invisible(db_bulk_create(x, "flights", flights))
#> Error: (413) - the request entity is too large
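A rough sketch of the chunking idea is below. It assumes the same sofa connection `x` as above and that `db_bulk_create()` accepts a data.frame per call; `bulk_create_chunked()` and `chunk_size` are hypothetical names used only for illustration, not part of the package.

# minimal sketch: split the data.frame into chunks and write each
# chunk with its own _bulk_docs request, so no single request body
# is large enough to trigger the 413 error above
library(sofa)
library(nycflights13)

bulk_create_chunked <- function(cushion, dbname, df, chunk_size = 5000) {
  # group row indices into blocks of at most chunk_size rows
  groups <- ceiling(seq_len(nrow(df)) / chunk_size)
  idx <- split(seq_len(nrow(df)), groups)
  # write one chunk at a time
  for (i in idx) {
    db_bulk_create(cushion, dbname, df[i, , drop = FALSE])
  }
  invisible(NULL)
}

# e.g. bulk_create_chunked(x, "flights", flights, chunk_size = 5000)

The right chunk size would likely depend on the server's request size limit (the 413 above), so it probably wants to be a user-tunable parameter rather than a fixed value.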