-
Step 9 of the codelab asks you to enter ‘gs://firebase-recommendations/recommendations-test/formatted_data_filtered.txt' in the Select file from GCS bucket field. For Create table from, select ‘…
-
## Steps to reproduce
1. Clone the project
2. Run the backups integration test: `tox -e integration -- tests/integration/test_backups.py::test_backup_aws --keep-models`
3. Encounter `botocore…
-
The backup failure occurs because GCS could not handle the number of requests at that specific point in time, due to rate limiting on the GCS side. But, in this particular scenario, there c…
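Since the root cause is transient rate limiting, the usual client-side remedy is to retry with exponential backoff and jitter. A minimal sketch, not tied to this project's backup code (the function and parameter names are illustrative):

```python
import random
import time

def retry_with_backoff(fn, max_attempts=5, base_delay=0.5, max_delay=32.0):
    """Call fn(), retrying on failure with exponential backoff plus
    jitter -- the standard remedy for transient 429/503 responses
    from GCS when request rates spike."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the original error
            # double the delay each attempt, capped, with random jitter
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay))
```

In real code the `except` clause should match only retryable errors (e.g. HTTP 429/503), not every exception.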
-
Hi, I'm trying to run Parseable on my GCP cloud, but I am not able to connect to my GCS bucket. I am getting the following error -
This is my `env` file -
```
P_STAGING_DIR=/staging
P_ADDR=0.0.…
-
When a new GCS backend bucket is created, it defaults to the US multi-region. This behaviour should be configurable so that the backend state bucket can be created in a desired location.
Perhaps nee…
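The requested behaviour could be sketched as reading the location from configuration while preserving the current default; the environment variable name here is hypothetical, not an existing option:

```python
import os

def backend_bucket_location(default="US"):
    """Return the desired GCS bucket location, falling back to the
    current US multi-region default when nothing is configured.
    BACKEND_BUCKET_LOCATION is a hypothetical variable name, used
    only to illustrate the shape of the feature request."""
    return os.environ.get("BACKEND_BUCKET_LOCATION", default)
```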
-
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the…
-
**Summary**
This is quite an exciting feature.
With this feature implemented, Databend can migrate/back up data without third-party tools, which frees Databend from storage vendor lock-in.
```sql…
-
### Is this a possible security vulnerability?
- [X] This is NOT a possible security vulnerability
### Describe the bug
When dropping a table, the data folder is deleted but the metadata folder rem…
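The expected behaviour is that dropping a table removes both prefixes. A sketch against a dict-backed stand-in for the bucket (the `data/`/`meta/` layout is hypothetical, not Databend's actual key scheme):

```python
def drop_table_objects(store, table_id):
    """Delete every object under BOTH the data and metadata prefixes
    of a table. `store` stands in for a bucket: a dict of key -> bytes.
    The prefix layout here is hypothetical, for illustration only."""
    for prefix in (f"data/{table_id}/", f"meta/{table_id}/"):
        # snapshot keys first so we can delete while iterating
        for key in [k for k in store if k.startswith(prefix)]:
            del store[key]
```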
-
Note: this is not a blocker, since we can psalm-suppress the error and the code actually works fine, but I'm reporting it here because it seems like an easy fix and others might encounter the same issue.
…
-
We currently do something like this in our recording rules:
```
# get usage metric per bucket
max by (bucket_name, location) (
last_over_time((stackdriver_gcs_bucket_storage_googleap…