Closed cgoodfred closed 3 years ago
Merging #320 (9806ad5) into master (62b3327) will decrease coverage by 0.1%. The diff coverage is 88.6%.
@@           Coverage Diff            @@
##           master    #320    +/-   ##
========================================
- Coverage    93.1%   93.0%   -0.1%
========================================
  Files          45      45
  Lines        3009    3289    +280
  Branches      405     450     +45
========================================
+ Hits         2802    3059    +257
- Misses        112     126     +14
- Partials       95     104      +9
This PR allows nise data to be uploaded to a GCP bucket and then creates a BigQuery table from that bucket's data, so that the table can be downloaded by koku for processing.
There are a few steps that need to happen before testing this: you need a GCP account with an existing storage bucket, plus credentials for that GCP account. The credentials need to be added to your environment, looking something like:

GOOGLE_APPLICATION_CREDENTIALS='path/to/creds'
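A quick pre-flight check on that variable can save a confusing failure later. This is a sketch; `resolve_gcp_credentials` is a hypothetical helper, not part of nise:

```python
import os


def resolve_gcp_credentials(env=os.environ):
    """Return the service-account key path from the environment.

    Raises RuntimeError when GOOGLE_APPLICATION_CREDENTIALS is unset
    or does not point at an existing file.
    """
    path = env.get("GOOGLE_APPLICATION_CREDENTIALS")
    if not path:
        raise RuntimeError("GOOGLE_APPLICATION_CREDENTIALS is not set")
    if not os.path.isfile(path):
        raise RuntimeError(f"Credential file not found: {path}")
    return path
```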
The credentials you use must also have a few permissions set in order to work:
storage.buckets.get
storage.objects.create
storage.objects.delete
bigquery.datasets.delete
bigquery.datasets.create
bigquery.jobs.create
bigquery.tables.delete
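For a quick client-side sanity check, the list above can be kept as a constant and compared against what a service account was actually granted. `missing_permissions` is a hypothetical helper, not part of nise:

```python
# The permissions listed above, required by this workflow.
REQUIRED_GCP_PERMISSIONS = {
    "storage.buckets.get",
    "storage.objects.create",
    "storage.objects.delete",
    "bigquery.datasets.delete",
    "bigquery.datasets.create",
    "bigquery.jobs.create",
    "bigquery.tables.delete",
}


def missing_permissions(granted):
    """Return the required permissions absent from the granted iterable."""
    return sorted(REQUIRED_GCP_PERMISSIONS - set(granted))
```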
There are a few new arguments for GCP that make this work:

--gcp-dataset-name specifies the name of the BigQuery dataset to use and is required.
--gcp-table-name specifies the name of the table to use; this is optional and defaults to gcp_billing_export_<etag> if not supplied.

To test without specifying a static file:
nise report gcp --gcp-bucket-name my-bucket-name --gcp-dataset-name my_dataset_name --gcp-etag nise_test -s 2021-02-01
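The table-naming behavior described above (falling back to gcp_billing_export_<etag> when --gcp-table-name is omitted) can be sketched as follows; this helper is illustrative, not nise's actual code:

```python
def resolve_table_name(etag, table_name=None):
    """Use the explicit --gcp-table-name value when given, otherwise
    fall back to the documented gcp_billing_export_<etag> default."""
    return table_name if table_name else f"gcp_billing_export_{etag}"
```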
This uploads a file to your bucket named nise_test/202102_nise_test_2021-02-01:2021-02-11.json and creates a BigQuery table named gcp_billing_export_nise_test from the example above. You can then add a GCP source in koku with a payload like:

{ "name": "Test GCP Source", "source_type": "GCP", "authentication": { "credentials": { "project_id": "<your_project_id_here>" } }, "billing_source": { "data_source": { "dataset": "<dataset_name>" } } }

Once it has processed, hit http://localhost:8000/api/cost-management/v1/reports/gcp/costs/ and see that there is now information available.

The only difference when specifying a static file is to include that in the nise call, for example something like:
nise report gcp --static-report-file my-test-file.yaml --gcp-bucket-name my-bucket-name --gcp-dataset-name my_dataset_name --gcp-etag nise_test
The following steps should remain the same. Here is an example YAML file (remove the .txt, as GitHub doesn't allow you to upload a YAML directly): example_gcp_static_data.yml.txt
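For reference, the bucket object name shown earlier (202102_nise_test_2021-02-01:2021-02-11.json under the nise_test/ prefix) appears to follow an <etag>/<YYYYMM>_<etag>_<start>:<end>.json pattern. A sketch of that assumption, inferred from the single example above rather than from nise's actual code:

```python
from datetime import date


def report_object_name(etag, start, end):
    """Compose <etag>/<YYYYMM>_<etag>_<start>:<end>.json, matching the
    object name observed in the example run (assumed pattern)."""
    return f"{etag}/{start:%Y%m}_{etag}_{start:%Y-%m-%d}:{end:%Y-%m-%d}.json"
```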