Closed — albertog closed this issue 6 years ago
What you have there, and the part below, should work locally and the same remotely if using getServiceAccount() to read the JSON file.
But is there a specific reason you want to use a JSON cert file within the cloud function?
(GCF has its own service account identity which you can use by default.)
package main

import (
	"io/ioutil"
	"log"

	"cloud.google.com/go/bigquery"
	"golang.org/x/net/context"
	"golang.org/x/oauth2/google"
	"google.golang.org/api/iterator"
	"google.golang.org/api/option"
)

func main() {
	serviceAccountJSONFile := "json_cert_file.json"
	dat, err := ioutil.ReadFile(serviceAccountJSONFile)
	if err != nil {
		log.Fatalf("Unable to read service account file: %v", err)
	}
	conf, err := google.JWTConfigFromJSON(dat, bigquery.Scope)
	if err != nil {
		log.Fatalf("Unable to generate JWT config: %v", err)
	}
	ctx := context.Background()
	src := conf.TokenSource(ctx)
	client, err := bigquery.NewClient(ctx, "project2", option.WithTokenSource(src))
	if err != nil {
		log.Fatalf("Unable to acquire BigQuery client: %v", err)
	}
	ts := client.Dataset("dataset1").Tables(ctx)
	for {
		t, err := ts.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("Unable to iterate tables: %v", err)
		}
		log.Printf("Table: %q\n", t.TableID)
	}
}
$ go run src/main.go
2017/09/05 10:51:52 Unable to acquire storage Client: googleapi: Error 403: Access Denied: Dataset project2:table1: The user somesvc_acct@project.iam.gserviceaccount.com does not have bigquery.tables.list permission for dataset project2:table1., accessDenied
exit status 1
Then add the BigQuery Data Viewer role on the remote project2 for this service account:
$ go run src/main.go
2017/09/05 10:52:37 Table: "customers"
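Granting that role from the command line might look like the following (a sketch: the project and service-account names are taken from the error output above, and roles/bigquery.dataViewer is the predefined BigQuery Data Viewer role):

```shell
gcloud projects add-iam-policy-binding project2 \
  --member="serviceAccount:somesvc_acct@project.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
```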
You should also be able to get credentials from the metadata service: https://cloud.google.com/compute/docs/storing-retrieving-metadata
You don't have to call the metadata service explicitly either....the default clients would pick up creds from GCF's metadata server.
see
@salrashid123, metadata.google.internal is GCF's metadata server.
Yeah, what I was stating by that is that you don't have to explicitly interrogate the metadata server to get a token (ADC will go down the list and seek it out for you automatically).
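Concretely, that means the explicit token-source plumbing can be dropped entirely. A minimal sketch (reusing the project2/dataset1 names from the example above, and assuming the runtime identity already has the BigQuery roles granted — this won't run without valid credentials):

```go
package main

import (
	"log"

	"cloud.google.com/go/bigquery"
	"golang.org/x/net/context"
	"google.golang.org/api/iterator"
)

func main() {
	ctx := context.Background()
	// No key file and no option.WithTokenSource: NewClient falls back to
	// Application Default Credentials, which checks
	// GOOGLE_APPLICATION_CREDENTIALS, then gcloud user credentials,
	// then the metadata server on GCE/GCF.
	client, err := bigquery.NewClient(ctx, "project2")
	if err != nil {
		log.Fatalf("Unable to get client: %v", err)
	}
	it := client.Dataset("dataset1").Tables(ctx)
	for {
		t, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			log.Fatalf("Unable to iterate tables: %v", err)
		}
		log.Printf("Table: %q", t.TableID)
	}
}
```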
Thanks, I have re-coded it following your links and it works perfectly; I'm pasting my code to share with others. In my opinion, everything related to permissions, credentials, etc. is very powerful, but it needs more real how-to examples. Once you see the final solution it is easy; the difficulty is finding it.
ctx := context.TODO()
ts, err := google.DefaultTokenSource(ctx, bigquery.Scope)
if err != nil {
	log.Fatalf("Unable to get default token source: %v", err)
}
client, err := bigquery.NewClient(ctx, "xxxx", option.WithTokenSource(ts))
if err != nil {
	log.Fatalf("Unable to get client: %v", err)
}
q := client.Query(QUERY)
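To round out that snippet, actually running the query and reading back rows might look like this (a sketch continuing the fragment above; QUERY and the project ID are placeholders from that snippet, and this assumes the same imports plus google.golang.org/api/iterator):

```go
// Run the query and iterate over the result rows.
it, err := q.Read(ctx)
if err != nil {
	log.Fatalf("Unable to run query: %v", err)
}
for {
	var row []bigquery.Value
	err := it.Next(&row)
	if err == iterator.Done {
		break
	}
	if err != nil {
		log.Fatalf("Unable to read row: %v", err)
	}
	log.Println(row)
}
```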
I have tried to perform a query from one project A against BigQuery in another project B, and I am getting a connection error (I am using a service account).
The same app running locally works properly. I have spent a lot of time on permissions, service accounts, tokens, credentials, etc.
Can someone confirm that the current version of c-f-go supports cross-project environments?
Thanks.
PS: a pointer to an example would be perfect.