Open fpeter8 opened 6 years ago
The table is updated regularly for running jobs, so you cannot calculate the cost once; you have to subscribe to the changes or poll for updates.
The JSON pricelist is obsolete; you have to use this API instead: https://cloud.google.com/billing/v1/how-tos/catalog-api
Usage:
Relevant entries:
Each region may have different prices; the API returns the applicable regions in the "serviceRegions" field. The Dataflow region is shown in the table below the "Resource metrics" table.
For "Current" fields the value is {price} * {current value} / {time period}, e.g. "Current vCPU" = 1 and "vCPU Time Streaming" = 0.00054 USD/hour give 1 * 0.00054 = 0.00054 USD/hour.
For "Total" fields the value is {price} * {total value}, e.g. "Total vCPU" = 5.14 vCPU h and "vCPU Time Streaming" = 0.00054 USD/hour give 5.14 * 0.00054 = 0.0027756 USD.
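A minimal sketch of the two formulas above; the function names and the hard-coded streaming vCPU price are illustrative, not part of any API:

```python
# Illustrative price from the example above: "vCPU Time Streaming".
VCPU_STREAMING_USD_PER_HOUR = 0.00054

def current_cost_rate(current_value: float,
                      price_per_hour: float = VCPU_STREAMING_USD_PER_HOUR) -> float:
    """'Current' fields: {price} * {current value} / {time period} -> USD/hour."""
    return current_value * price_per_hour

def total_cost(total_value_hours: float,
               price_per_hour: float = VCPU_STREAMING_USD_PER_HOUR) -> float:
    """'Total' fields: {price} * {total value} -> USD."""
    return total_value_hours * price_per_hour

print(current_cost_rate(1))   # "Current vCPU" = 1  ->  0.00054 USD/hour
print(total_cost(5.14))       # "Total vCPU" = 5.14 vCPU h  ->  ~0.0027756 USD
```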
For development you can use the API key 'AIzaSyBZVfVwDKpduSuNOJlvWildIeQ5AsNtnWM' (not sure if we should use this in prod, or get it from the user via config).
✅ A currency code can be specified. ❓ Is the pricingInfo array a chronological ordering of the price history?
https://cloud.google.com/billing/reference/rest/v1/services.skus/list
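A hedged sketch of calling the services.skus/list endpoint linked above. `SERVICE_ID` is a placeholder (the Dataflow service ID has to be looked up via the /v1/services endpoint), the API key is passed as a query parameter, and results are paged via `nextPageToken`:

```python
import json
import urllib.parse
import urllib.request

API_ROOT = "https://cloudbilling.googleapis.com/v1"

def skus_url(service_id: str, api_key: str,
             currency_code: str = "USD", page_token: str = "") -> str:
    """Build the services.skus.list request URL (currencyCode is optional)."""
    params = {"key": api_key, "currencyCode": currency_code}
    if page_token:
        params["pageToken"] = page_token
    return f"{API_ROOT}/services/{service_id}/skus?" + urllib.parse.urlencode(params)

def list_skus(service_id: str, api_key: str, currency_code: str = "USD") -> list:
    """Fetch every SKU page for one billing service."""
    skus, token = [], ""
    while True:
        url = skus_url(service_id, api_key, currency_code, token)
        with urllib.request.urlopen(url) as resp:
            payload = json.load(resp)
        skus.extend(payload.get("skus", []))
        token = payload.get("nextPageToken", "")
        if not token:
            return skus
```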
The cost metrics should be displayed in the user's preferred currency format. The user should be able to define their preferred currency, and should also be able to define a project-specific currency format.
I think the project-specific configuration is over-engineering. I believe a global GCPimp-level configuration is more than enough.
Description of the SKU api response: https://cloud.google.com/billing/reference/rest/v1/services.skus/list#PricingExpression
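A sketch of turning one SKU's PricingExpression into a per-usage-unit price. Field names follow the services.skus/list response linked above (`pricingInfo`, `pricingExpression`, `tieredRates`, `unitPrice` with `units`/`nanos`); the sample dict is made up for illustration, and taking the last `pricingInfo` entry assumes it is the current price:

```python
def unit_price(sku: dict) -> float:
    """Return the current per-unit price of a SKU as a float.

    Assumes pricingInfo is chronological, so the last entry is current,
    and reads the first (often only) tiered rate.
    """
    expr = sku["pricingInfo"][-1]["pricingExpression"]
    rate = expr["tieredRates"][0]["unitPrice"]
    # Money is split into whole units (int64 as a string) and nanos.
    return int(rate.get("units", 0)) + rate.get("nanos", 0) / 1e9

# Made-up sample shaped like a skus/list response entry.
sample = {
    "pricingInfo": [{
        "pricingExpression": {
            "usageUnitDescription": "hour",
            "tieredRates": [{
                "unitPrice": {"currencyCode": "USD",
                              "units": "0", "nanos": 540000},
            }],
        }
    }]
}
print(unit_price(sample))  # 0.00054
```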
On the Dataflow job detail page (https://console.cloud.google.com/dataflow?project=projectId -> go to any job), in the right-hand panel under "Resource metrics", add two new rows:
Note that the page looks different for batch and streaming jobs (and possibly for SDK 1/SDK 2 jobs as well).