grafana / google-bigquery-datasource

Google BigQuery Datasource Plugin for Grafana.

better handle large results (do not allow it to happen) #235

Open gabor opened 8 months ago

gabor commented 8 months ago

When the result contains too many rows, it's hard to guarantee good behavior/performance. We should investigate adding a hardcoded limit, say 5000 rows: if the response contains more than 5000 rows, we return an error with a clear message explaining the situation.

Alternatively, we could make this limit configurable in the datasource config JSON.
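For illustration, here is a rough Go sketch of how such a limit could work. It assumes a hypothetical `maxRows` field in the datasource settings JSON and a hypothetical `readWithLimit` helper around the BigQuery row iterator; this is not the plugin's current code, just one way the proposal could look:

```go
package main

import (
	"encoding/json"
	"fmt"

	"cloud.google.com/go/bigquery"
	"github.com/grafana/grafana-plugin-sdk-go/backend"
	"google.golang.org/api/iterator"
)

// Hypothetical default, matching the 5000-row suggestion above.
const defaultMaxRows = 5000

// rowLimitFromSettings reads an optional "maxRows" field from the datasource
// settings JSON. The field name is an assumption, not an existing option.
func rowLimitFromSettings(s backend.DataSourceInstanceSettings) int {
	var cfg struct {
		MaxRows int `json:"maxRows"`
	}
	if err := json.Unmarshal(s.JSONData, &cfg); err == nil && cfg.MaxRows > 0 {
		return cfg.MaxRows
	}
	return defaultMaxRows
}

// readWithLimit stops reading once the limit is exceeded and returns a
// descriptive error instead of an arbitrarily large result.
func readWithLimit(it *bigquery.RowIterator, limit int) ([][]bigquery.Value, error) {
	var rows [][]bigquery.Value
	for {
		var row []bigquery.Value
		err := it.Next(&row)
		if err == iterator.Done {
			return rows, nil
		}
		if err != nil {
			return nil, err
		}
		if len(rows) >= limit {
			return nil, fmt.Errorf("query returned more than %d rows; add a LIMIT clause or aggregate in SQL", limit)
		}
		rows = append(rows, row)
	}
}
```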

shih-chris commented 8 months ago

This is not often the case, but there are instances where I've used a Grafana transform to group and aggregate data returned by the BigQuery data source.

That said, I totally understand that there are limitations to what can be handled/displayed while maintaining performance.

I suppose my main concern would be something "failing" silently, where a transformed result starts to provide incorrect stats due to the truncated results. Perhaps this is something we can solve by providing some warning when the result is truncated?
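One way to surface such a warning (a sketch, not existing plugin behavior) would be to attach a notice to the returned data frame via the plugin SDK; Grafana renders frame notices as a warning on the panel. The `addTruncationWarning` helper below is hypothetical:

```go
package main

import (
	"fmt"

	"github.com/grafana/grafana-plugin-sdk-go/data"
)

// addTruncationWarning attaches a warning notice to a frame so the panel
// shows that the result was cut off rather than failing silently.
func addTruncationWarning(frame *data.Frame, limit int) {
	if frame.Meta == nil {
		frame.Meta = &data.FrameMeta{}
	}
	frame.Meta.Notices = append(frame.Meta.Notices, data.Notice{
		Severity: data.NoticeSeverityWarning,
		Text:     fmt.Sprintf("Result truncated to the first %d rows", limit),
	})
}
```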

gabor commented 8 months ago

Hi @shih-chris, thanks for describing your use case. I see that some situations need to process a lot of data. It's a tough balance, because on the other "side" you don't want your components to potentially consume enormous amounts of resources (memory/CPU)... still, I see your point 👍

Regarding the failing-silently problem: yes, I agree, we should not just truncate the results silently. IF (and that's a big IF) we go the route of limiting result sizes, we will definitely produce a good error message.
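As a sketch of how that error message could reach the user: the per-query `backend.DataResponse` in the plugin SDK carries an `Error` field that Grafana displays on the panel. The helper below is hypothetical, not the plugin's actual code:

```go
package main

import (
	"fmt"

	"github.com/grafana/grafana-plugin-sdk-go/backend"
)

// limitExceededResponse wraps the row-limit error into the per-query
// response so Grafana shows the message instead of a partial result.
func limitExceededResponse(limit int) backend.DataResponse {
	return backend.DataResponse{
		Error: fmt.Errorf("the query returned more than %d rows; refine the query or aggregate in SQL", limit),
	}
}
```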