Motivation
We currently index epoch reward data and write it to BigQuery. Epoch rewards cannot readily be analysed on Dune because voting rewards do not emit an event; computing voting rewards instead requires access to an archive node.
We provide analytics on this data via the explorer stats page.
Downside to explorer stats
The explorer stats page is cumbersome to maintain, requiring:
1. GCP Cloud Functions to fetch data and write it to BigQuery
2. a data replication pipeline from the Blockscout Postgres db to BigQuery
3. dbt models to create tables from the raw replicated tables
4. Google Looker Studio to make dashboards
5. iframes to display the stats on the explorer
Proposed solution
Ideally, most community analytics are provided on dune.com. Dune has the benefit that the community can fork, edit, and improve SQL queries with ease. It also provides indexed data (which can replace the data replication pipeline and the BigQuery instance), a SQL interface (which can replace the dbt models), and dashboards (which can replace Looker Studio).
Context
Dune now allows users to upload data in .csv files programmatically via their API (see docs: Uploading data).
Design considerations
Upload method
[x] Start testing with this Google Colab notebook.
A better design might involve using a GCP Cloud Function (this is how USD/CELO FX data is currently fetched from CoinGecko and written to BigQuery); see the sketch below.
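For reference, a minimal sketch of the upload step, assuming the endpoint and JSON payload shape described in Dune's "Uploading data" docs; the table name, CSV path, and API-key environment variable are placeholders:

```python
import os

import requests

DUNE_UPLOAD_URL = "https://api.dune.com/api/v1/table/upload/csv"


def upload_csv_to_dune(table_name: str, csv_path: str, description: str = "") -> None:
    """Upload (or re-upload) a CSV to Dune under the given table_name."""
    with open(csv_path, "r", encoding="utf-8") as f:
        csv_text = f.read()

    resp = requests.post(
        DUNE_UPLOAD_URL,
        headers={"X-DUNE-API-KEY": os.environ["DUNE_API_KEY"]},
        json={
            "table_name": table_name,
            "description": description,
            "data": csv_text,
        },
    )
    resp.raise_for_status()


if __name__ == "__main__":
    # Could be wrapped in a GCP Cloud Function and triggered once per epoch (~daily).
    upload_csv_to_dune(
        table_name="celo_epoch_rewards",      # hypothetical table name
        csv_path="epoch_rewards.csv",         # daily export from BigQuery
        description="Celo epoch rewards, including voting rewards",
    )
```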
Upload frequency
An epoch is ~24h, so data will need to be uploaded at most once per day.
File size
There is a limit of 200MB for data uploads.
[x] Check what the file size of exporting all epoch rewards from BigQuery would be (see the sketch below)
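As a rough check, a sketch using the BigQuery Python client; the table ID is a placeholder, and num_bytes reports the table's stored logical size rather than the exact CSV export size, so treat it as an approximation:

```python
from google.cloud import bigquery

# Placeholder table ID; substitute the project/dataset/table that holds epoch rewards.
TABLE_ID = "my-project.my_dataset.epoch_rewards"
DUNE_UPLOAD_LIMIT_BYTES = 200 * 1024 * 1024  # Dune's 200MB upload limit

client = bigquery.Client()
table = client.get_table(TABLE_ID)

print(f"rows: {table.num_rows}, logical size: {table.num_bytes / 1e6:.1f} MB")
if table.num_bytes > DUNE_UPLOAD_LIMIT_BYTES:
    print("Full export likely exceeds the 200MB limit; consider splitting by epoch range.")
else:
    print("Full export should fit within the 200MB limit.")
```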
Updating data
Currently there is no way to update the data in an already existing .csv file. However, you can simply make a new request to the API with the same table_name and replace the existing file that way. (Source: Updating data)
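This means the daily upload (e.g. the sketch under Upload method) can simply re-post the refreshed export under a fixed table_name after each epoch; assuming the behaviour described in the docs, no separate update step is needed.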