We can't send all results to the browser at once; too much data overwhelms it.
We need additional APIs:
/api/queryasync starts the query and returns a query id. Args: the query string. Result:
{
"query_id": 15
}
Optional args: max_rows to limit the number of rows, max_data_size to limit the size of the data.
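A minimal sketch of a front-end helper that starts a query. The `doPost` parameter stands in for an HTTP client (e.g. a thin wrapper over fetch); its name and the helper itself are illustrative, only the endpoint path and args come from the spec above:

```typescript
// Response shape of /api/queryasync, per the spec above.
type StartResult = { query_id: number };

// Start an async query; optional limits are passed through as-is.
// `doPost` is a stand-in for the app's HTTP layer (hypothetical).
async function startQuery(
  doPost: (url: string, body: object) => Promise<StartResult>,
  query: string,
  opts?: { max_rows?: number; max_data_size?: number },
): Promise<number> {
  const res = await doPost("/api/queryasync", { query, ...opts });
  return res.query_id;
}
```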
/api/queryasyncstatus returns the status of the query with the given id. Args: query id. Result:
{
"rows_count": 123,
"data_size": 8234, // in bytes, approximate; we calculate it by summing the sizes of all values
"finished": false,
"time_to_first_result_ms": 34, // time it took to get the first result from the server
"total_query_time_ms": 1234, // time it took to get all results
"columns": [ "id", "name" ],
"error": null // string if there was an error
}
The front-end polls for status until finished is true. At first we can just buffer result data in memory; a more sophisticated implementation would store data on disk and build an index that allows efficient random access.
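The polling loop above can be sketched as follows. The `Status` type mirrors the response body in the spec; `getStatus` stands in for the actual HTTP call to /api/queryasyncstatus, and the interval is an arbitrary choice:

```typescript
// Status response of /api/queryasyncstatus, per the spec above
// (timing fields omitted for brevity).
type Status = {
  rows_count: number;
  data_size: number;
  finished: boolean;
  columns: string[];
  error: string | null;
};

// Poll until the query finishes or errors. `getStatus` is a stand-in
// for the HTTP call; the 250 ms default interval is an assumption.
async function pollUntilFinished(
  getStatus: (queryId: number) => Promise<Status>,
  queryId: number,
  intervalMs = 250,
): Promise<Status> {
  for (;;) {
    const st = await getStatus(queryId);
    if (st.error) throw new Error(st.error);
    if (st.finished) return st;
    await new Promise((r) => setTimeout(r, intervalMs));
  }
}
```

Polling at a fixed interval is the simplest approach; a backoff schedule would reduce load for long-running queries.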
/api/queryasyncdata returns a subset of the results.
Args: query_id, start, count. Result:
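The result shape of /api/queryasyncdata isn't specified above; the sketch below assumes it returns rows for the range [start, start + count), each row an array of values matching the columns from the status response. `getPage` stands in for the HTTP call, and the page size is an arbitrary choice:

```typescript
// Assumed response shape of /api/queryasyncdata (not in the spec above):
// rows as arrays of values, in column order.
type DataPage = { rows: unknown[][] };

// Stream all rows in fixed-size pages. `totalRows` comes from the
// rows_count field of the final status response.
async function* pageRows(
  getPage: (queryId: number, start: number, count: number) => Promise<DataPage>,
  queryId: number,
  totalRows: number,
  pageSize = 100,
): AsyncGenerator<unknown[]> {
  for (let start = 0; start < totalRows; start += pageSize) {
    const count = Math.min(pageSize, totalRows - start);
    const page = await getPage(queryId, start, count);
    for (const row of page.rows) yield row;
  }
}
```

A virtualized table in the browser would call getPage directly with the visible range instead of streaming everything.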