We are using Supabase Self-hosted with BigQuery as the 'sink' for our logs. When trying to fetch the logs in Supabase Studio, we get the following error:
{
  "code": 400,
  "errors": [
    {
      "domain": "global",
      "location": "q",
      "locationType": "parameter",
      "message": "Name metadata not found inside t at [1:1060]",
      "reason": "invalidQuery"
    }
  ],
  "message": "Name metadata not found inside t at [1:1060]",
  "status": "INVALID_ARGUMENT"
}
Analyzing the query that Supabase Studio sends to BigQuery, we see that it selects t.metadata from each of the log tables.
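Stripped down, the failing pattern looks roughly like this; the project, dataset, and table names are placeholders for illustration, not the actual generated query:

-- Illustrative only: against a table whose schema has no `metadata` column,
-- BigQuery rejects this with "Name metadata not found inside t".
SELECT t.timestamp, t.event_message, t.metadata
FROM `my_project.my_logflare_dataset.some_empty_log_table` AS t
LIMIT 10;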
But when we look at the data in some of the tables, we notice they contain no rows yet, so the only columns in those tables are the following (see the schema query after the list):
timestamp
id
event_message
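For anyone reproducing this, the tables that are missing the column can be listed with a schema query like the one below (again, the project and dataset names are placeholders for wherever Logflare writes our logs):

-- Lists every table in the log dataset whose schema has no `metadata` column.
SELECT table_name
FROM `my_project.my_logflare_dataset.INFORMATION_SCHEMA.COLUMNS`
GROUP BY table_name
HAVING COUNTIF(column_name = 'metadata') = 0;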
So we are querying a column that does not exist on those tables. We also see that even though the generated BigQuery SQL includes SQL for all of the tables, only one table is actually queried; but since the SQL for all tables is sent as a single statement, it has to be valid against every table's schema.
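One workaround we could imagine, though we have not confirmed it is the supported fix, is to add the missing column to the empty tables ourselves so that the generated SQL validates. This assumes metadata is a repeated RECORD, and the nested fields below are placeholders; the real nested schema would have to be copied from a table that already holds data:

-- Unverified sketch: give an empty table a `metadata` column so the generated
-- query validates against its schema. The STRUCT fields are placeholders.
ALTER TABLE `my_project.my_logflare_dataset.some_empty_log_table`
ADD COLUMN IF NOT EXISTS metadata ARRAY<STRUCT<level STRING, project STRING>>;

Alternatively, since Logflare updates BigQuery table schemas from incoming events, sending a single event through each empty source would presumably create the column as well.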
Can you advise how we can proceed, or what steps we can take to get these queries to run in BigQuery without errors?