A production-ready Grafana DataSource plugin that adds support for BigQuery as a backend database.
There are multiple ways to install bigquery-grafana. See INSTALL for more information.
Sharded tables (`tablename_YYYYMMDD`) are supported.
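For context, a date-sharded table set is a group of tables whose names share a prefix and end in a `YYYYMMDD` suffix; in BigQuery standard SQL they can be queried together with a table wildcard. A minimal sketch, where the project, dataset, table, and column names are placeholders and not part of the plugin:

```sql
-- Query one week of date-sharded tables (tablename_YYYYMMDD) with a wildcard.
-- `my-project.my_dataset.events_*` is a placeholder; _TABLE_SUFFIX holds the YYYYMMDD part.
SELECT
  DATE(createDate) AS day,
  COUNT(*) AS events
FROM `my-project.my_dataset.events_*`
WHERE _TABLE_SUFFIX BETWEEN '20200615' AND '20200621'
GROUP BY day
ORDER BY day
```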
Plugin Demo:
To add the datasource in Grafana:

1. In the side menu under the Dashboards link you should find a link named Data Sources.
2. Click the + Add data source button in the top header.
3. Select BigQuery from the Type dropdown.

NOTE: If you're not seeing the Data Sources link in your side menu it means that your current user does not have the Admin role for the current organization.
| Name | Description |
| --- | --- |
| Name | The datasource name. This is how you refer to the datasource in panels & queries. |
| Default | Default datasource means that it will be pre-selected for new panels. |
| Service Account Key | Service Account Key File for a GCP Project. Instructions below on how to create it. |
You can now set the query priority ("INTERACTIVE" or "BATCH") per datasource.
You can manage the datasource via Grafana's provisioning system. See the example configuration file below.
```yaml
apiVersion: 1

datasources:
  - name: <Datasource Name>
    type: doitintl-bigquery-datasource
    access: proxy
    isDefault: true
    jsonData:
      authenticationType: jwt
      clientEmail: <Service Account Email>
      defaultProject: <Default Project Name>
      tokenUri: https://oauth2.googleapis.com/token
    secureJsonData:
      privateKey: |
        -----BEGIN PRIVATE KEY-----
        <Content of the Private Key>
        -----END PRIVATE KEY-----
    version: 2
    readOnly: false
```
There are two ways to authenticate the BigQuery plugin - either by uploading a Google JWT file, or by automatically retrieving credentials from Google's metadata server. The latter is only available when running Grafana on a GCE virtual machine.
To authenticate with the BigQuery API, you need to create a Google Cloud Platform (GCP) Service Account for the Project you want to show data for. A Grafana datasource integrates with one GCP Project. If you want to visualize data from multiple GCP Projects then you can give the service account permissions in each project or create one datasource per GCP Project.
To create the Service Account key:

1. Go to the BigQuery API page and click Enable to enable the API.
2. Navigate to the APIs & Services Credentials page.
3. Click on Create credentials and choose Service account key.
4. On the Create service account key page, choose key type JSON. Then, in the Service Account dropdown, choose the New service account option.
5. Some new fields will appear. Fill in a name for the service account in the Service account name field and then choose the BigQuery Data Viewer and BigQuery Job User roles from the Role dropdown.
6. Click the Create button. A JSON key file will be created and downloaded to your computer. Store this file in a secure place as it allows access to your BigQuery data.
Upload it to Grafana on the datasource Configuration page. You can either upload the file or paste in the contents of the file.
The file contents will be encrypted and saved in the Grafana database. Don't forget to save after uploading the file!
If Grafana is running on a Google Compute Engine (GCE) virtual machine, it is possible for Grafana to automatically retrieve default credentials from the metadata server. This has the advantage of not needing to generate a private key file for the service account and not having to upload the file to Grafana. However, for this to work, a few preconditions must be met: the virtual machine must run as a service account that has the BigQuery API scope. See instructions here. Read more about creating and enabling service accounts for GCE VM instances here.
The query builder provides a simple yet user-friendly interface to help you quickly compose a query. The builder enables you to define the basic parts of your query; the common ones are:
WHERE clause - Either use one of the predefined macros to speed up your writing, or set up your own expression. The supported macros are:
a. Macro $__timeFilter with last 7 days example:
WHERE `createDate` BETWEEN TIMESTAMP_MILLIS (1592147699012) AND TIMESTAMP_MILLIS (1592752499012) AND _PARTITIONTIME >= '2020-06-14 18:14:59' AND _PARTITIONTIME < '2020-06-21 18:14:59'
b. Macro $__timeFrom with last 7 days example:
WHERE `createDate` > TIMESTAMP_MILLIS (1592223758609) AND _PARTITIONTIME >= '2020-06-15 15:22:38' AND _PARTITIONTIME < '2020-06-22 15:22:38'
c. Macro $__timeTo with last 7 days example:
WHERE `createDate` < TIMESTAMP_MILLIS (1592828659681) AND _PARTITIONTIME >= '2020-06-15 15:24:19' AND _PARTITIONTIME < '2020-06-22 15:24:19'
You can now use the $__timeFilter macro in raw SQL mode.
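For illustration, a raw-mode query might look like the sketch below, assuming the macro takes the time column as its argument and expands to a range predicate like the example above; the project, dataset, table, and column names are placeholders:

```sql
-- Raw SQL mode sketch: the plugin replaces $__timeFilter(createDate) with a
-- time-range predicate on createDate that follows the dashboard's time picker.
-- `my-project.my_dataset.orders` and createDate are placeholder names.
SELECT
  TIMESTAMP_TRUNC(createDate, HOUR) AS time,
  COUNT(*) AS orders
FROM `my-project.my_dataset.orders`
WHERE $__timeFilter(createDate)
GROUP BY time
ORDER BY time
```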
Note: If your processing location is not the default US one, set your location from the Processing Location drop-down in the query builder.
Viewing your Query
The build works with Yarn:
```shell
yarn run build:dev
yarn run build:prod
```
Tests can be run with Jest:
```shell
yarn run test
```
See the CONTRIBUTING file.
See the LICENSE file.