-
Example code:
```python
vertexai.init(project=PROJECT_ID, location=LOCATION)
model: ImageGenerationModel = ImageGenerationModel.from_pretrained(model_name=MODEL_NAME)
Common.log_info(message='Handli…
-
### Terraform Version
```shell
Terraform v1.5.2
on darwin_amd64
+ provider registry.terraform.io/hashicorp/archive v2.4.0
+ provider registry.terraform.io/hashicorp/google v4.69.1
+ provider r…
-
Trying to read a Spark-generated hive-style partitioned Parquet dataset with **gcsfs** and **pyarrow**, but getting a **FileNotFoundError** if I try to read from the base directory or even if I try …
-
@jbusecke, I spent 15 minutes trying to figure out how to tell what region a GCS bucket is in, and failed.
* Do you know how to tell what region a bucket is in?
* Is the `gs://cmip6` bucket in `us-centra…
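For what it's worth, a bucket's location can be read from its metadata (assuming you have `storage.buckets.get` permission on it):

```shell
# Print full bucket metadata, including the "Location constraint" field
gsutil ls -L -b gs://cmip6

# Or, with the newer gcloud surface, just the location
gcloud storage buckets describe gs://cmip6 --format="value(location)"
```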
-
Hi,
I tried to deploy the **loki-simple-scalable** chart to store my logs in a GCS bucket. I use this values file:
```yaml
loki:
commonConfig:
storage:
filesystem: null
g…
-
Hi, when I use GCS as in the tutorial, I get the error "Disk [gcs] does not have a configured driver.". I use Laravel 7.24 and laravel-google-cloud-storage 2.2.3.
filesystem.php
`'gcs' => [
…
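In case it helps others: a common cause of "Disk [gcs] does not have a configured driver" is that the package's service provider never ran, so `Storage::extend('gcs', ...)` was never called. A sketch of the relevant pieces (provider class name from the Superbalist package, env-var and bucket names hypothetical; verify against your installed version):

```php
// config/app.php — only needed if package auto-discovery is disabled
'providers' => [
    // ...
    Superbalist\LaravelGoogleCloudStorage\GoogleCloudStorageServiceProvider::class,
],

// config/filesystems.php — hypothetical project/bucket values
'disks' => [
    'gcs' => [
        'driver' => 'gcs',
        'project_id' => env('GOOGLE_CLOUD_PROJECT_ID', 'my-project'),
        'key_file' => env('GOOGLE_CLOUD_KEY_FILE', null),
        'bucket' => env('GOOGLE_CLOUD_STORAGE_BUCKET', 'my-bucket'),
    ],
],
```

After changing config, running `php artisan config:clear` is worth trying, since a cached config can keep serving the old disk definition.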
-
### Describe the issue
Execute:
```bash
./run kestra:cli 1 flow validate kestra_flows/automations/bigquery --server=https://us.kestra.cloud --api-token= --tenant=
```
Result:
```bash
✘ …
-
We have **Avro** files stored in GCS bucket and I have created a table in BigQuery using source partitioning that queries data from that **GCS** bucket.
We also have a PySpark job that runs every hour, w…
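For context, a hive-partitioned external table over Avro files in GCS is typically declared along these lines (dataset, table, and bucket names here are placeholders, not our real ones):

```sql
CREATE EXTERNAL TABLE `my_dataset.events`
WITH PARTITION COLUMNS
OPTIONS (
  format = 'AVRO',
  uris = ['gs://my-bucket/events/*'],
  hive_partition_uri_prefix = 'gs://my-bucket/events'
);
```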
-
https://github.com/ClickHouse/ClickHouse/blob/32b765a4ba577acbfdb09a8d400dad8d4ef0f48d/src/TableFunctions/TableFunctionS3.cpp#L338C1-L348C3
We need to be able to choose the region for a GCS bucket; currently the default is `u…
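Today GCS is reachable through the existing `s3` table function via the interoperability endpoint, e.g. (bucket, object, and HMAC credentials below are placeholders):

```sql
SELECT *
FROM s3(
    'https://storage.googleapis.com/my-bucket/data.csv',
    'GOOG_HMAC_ACCESS_KEY',
    'GOOG_HMAC_SECRET',
    'CSV'
);
```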
-
Our Spark job is a long-running Spark session, but we need to access different buckets during the lifespan of the session, and our permission setup requires different credentials for different bucket…
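For a single-bucket job, the GCS connector can be pointed at one key file through Spark's Hadoop properties, e.g. (property names from the GCS connector docs, key-file path a placeholder; verify for your connector version). Note these settings are session-wide, which is exactly why per-bucket credentials in one long-running session are awkward:

```shell
spark-submit \
  --conf spark.hadoop.google.cloud.auth.service.account.json.keyfile=/secrets/sa-key.json \
  --conf spark.hadoop.fs.gs.impl.disable.cache=true \
  my_job.py
```

Disabling the Hadoop `FileSystem` cache (the generic `fs.<scheme>.impl.disable.cache` switch) is sometimes needed so reconfigured credentials actually take effect instead of a cached client being reused.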