Closed — nevillelyh closed 8 years ago
Spark clusters created with Dataproc or bdutil should have the GCS connector and GCP credentials already set up. A user should be able to just load the package and read/write BigQuery tables.
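As a rough sketch of what "just load the package" could look like on a Dataproc cluster: the package coordinate, version, and the `bigQueryTable` / `saveAsBigQueryTable` method names below are assumptions about this library's API, not confirmed from this thread.

```shell
# Hypothetical: launch spark-shell with the package on a Dataproc master node.
# The exact coordinate and version are assumptions; check the project README.
spark-shell --packages com.spotify:spark-bigquery_2.11:0.2.2

# Then, inside the shell (assumed API, shown here as a comment sketch):
#   import com.spotify.spark.bigquery._
#   val df = sqlContext.bigQueryTable("my-project:my_dataset.my_table")
#   df.saveAsBigQueryTable("my-project:my_dataset.my_output")
```

No extra credential or connector setup should be needed, since Dataproc images ship with the GCS connector and the cluster's service account credentials preconfigured.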
Verified that it works for Dataproc. Didn't test bdutil since it seems deprecated.