Logflare / logflare

Never get surprised by a logging bill again. Centralized structured logging for Cloudflare, Vercel, Elixir and Javascript.
https://logflare.app
Apache License 2.0

Rate Limit Error when trying to create BigQuery Tables #2240

Open srasul opened 3 weeks ago

srasul commented 3 weeks ago

When starting a new instance of Logflare with self-hosted Supabase, we see this error message during the startup of analytics:

supabase-analytics  | 12:45:53.801 [error] BigQuery dataset create error: 1_default: Exceeded rate limits: too many dataset metadata update operations for this dataset. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
supabase-analytics  | 
supabase-analytics  | 12:45:53.805 [error] BigQuery dataset create error: 1_default: Exceeded rate limits: too many dataset metadata update operations for this dataset. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
supabase-analytics  | 
supabase-analytics  | 12:45:53.808 [error] BigQuery dataset create error: 1_default: Exceeded rate limits: too many dataset metadata update operations for this dataset. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas
supabase-analytics  | 
supabase-analytics  | 12:45:53.828 [error] BigQuery dataset create error: 1_default: Exceeded rate limits: too many dataset metadata update operations for this dataset. For more information, see https://cloud.google.com/bigquery/docs/troubleshoot-quotas

After looking at the code, it seems that at this stage Logflare is trying to create the BigQuery tables. And according to the link in the error message, there is a rate limit of 5 operations per 10 seconds:


Is there a way to add a small delay between these operations so that we don't hit the rate limit?

We are using Logflare v1.8.2
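Logflare itself is written in Elixir, but the idea behind the requested delay can be illustrated language-agnostically. The sketch below is a sliding-window throttle capped at the 5-operations-per-10-seconds quota from the error message; the `Throttle` class and its parameters are hypothetical illustration, not Logflare code:

```python
import time
from collections import deque

class Throttle:
    """Sliding-window throttle: allow at most max_ops calls per window_s seconds.

    Illustrative sketch only. The default 5 ops / 10 s matches the BigQuery
    dataset-metadata quota referenced in the error message.
    """

    def __init__(self, max_ops=5, window_s=10.0):
        self.max_ops = max_ops
        self.window_s = window_s
        self.calls = deque()  # timestamps of recent calls

    def wait(self):
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window.
        while self.calls and now - self.calls[0] >= self.window_s:
            self.calls.popleft()
        if len(self.calls) >= self.max_ops:
            # Sleep until the oldest call leaves the window.
            time.sleep(self.window_s - (now - self.calls[0]))
        self.calls.append(time.monotonic())
```

Calling `throttle.wait()` before each dataset operation would space the calls out so that no more than five land in any ten-second window.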

srasul commented 3 weeks ago

According to:

https://cloud.google.com/bigquery/docs/troubleshoot-quotas#ts-maximum-update-table-metadata-limit

this limit cannot be raised, so one possible solution is to add a delay between these operations so that the rate limit is not hit.
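Since the quota cannot be raised, a common workaround alongside (or instead of) a fixed delay is to retry with exponential backoff and jitter whenever the rate-limit error comes back. A minimal sketch (Python for illustration; `with_backoff` and `is_rate_limit` are hypothetical names, not Logflare's actual API):

```python
import random
import time

def with_backoff(op, max_attempts=5, base_delay=1.0, is_rate_limit=None):
    """Run op(), retrying with exponential backoff plus jitter on rate-limit errors.

    Illustrative sketch only. `is_rate_limit` classifies exceptions; anything
    it rejects (or any error on the final attempt) is re-raised unchanged.
    """
    for attempt in range(max_attempts):
        try:
            return op()
        except Exception as exc:
            if is_rate_limit is not None and not is_rate_limit(exc):
                raise  # not a rate-limit error: fail fast
            if attempt == max_attempts - 1:
                raise  # out of retries
            # Exponential backoff: base, 2*base, 4*base, ... plus random jitter.
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            time.sleep(delay)
```

With the quota at 5 updates per 10 seconds, a base delay of around 2 seconds would keep a burst of dataset-create calls under the limit after the first retry.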