-
My company uses s5cmd to speed up uploads to GCS via the S3 interoperability feature. Is S3 interoperability supported, or is support planned?
-
For uploading large files to GCS, it's recommended to use resumable uploads.
If there's interest I'd be down to contribute this upstream to this crate.
More details about how to do it here: htt…
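For context, GCS resumable uploads send the file in chunks whose sizes (all but the last) must be multiples of 256 KiB. A minimal sketch of the chunk-range planning only, with the actual session PUTs left out; the helper name and sizes are illustrative, not part of any existing crate:

```python
CHUNK_ALIGN = 256 * 1024  # GCS requires chunk sizes in multiples of 256 KiB

def plan_chunks(total_size: int, chunk_size: int):
    """Return (start, end_inclusive) byte ranges for a resumable upload."""
    if chunk_size % CHUNK_ALIGN != 0:
        raise ValueError("chunk_size must be a multiple of 256 KiB")
    ranges = []
    start = 0
    while start < total_size:
        end = min(start + chunk_size, total_size) - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

# Each range maps to one PUT against the resumable session URI with a
# Content-Range header like "bytes {start}-{end}/{total_size}".
```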
-
Simply attempting to call `bucket.initiate_multipart_upload('/testkey')` raises a 403 on GCS.
A cursory look suggests that GS's bucket class is inheriting the method when it needs to be overridden. GCS multi…
-
C:/Users/wh/speech/google_speech_to_text/video\TheSimpsons.mp4
TheSimpsons.ogg
Traceback (most recent call last):
File "auto_run.py", line 23, in
auto_run(directory)
File "auto_run.py", …
-
Hi Team,
I am using a Cloud Function to upload files to GCS, and I am seeing latency because of the many resumable API calls being made; the file size is approximately 100-300 MB, …
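One common mitigation is to raise the client's chunk size so each file needs fewer resumable PUT round trips (in the Python client this is the `Blob.chunk_size` attribute). The round-trip count is simple arithmetic; a sketch with illustrative sizes:

```python
import math

MiB = 1024 * 1024

def resumable_put_count(file_size: int, chunk_size: int) -> int:
    """Number of chunked PUT requests a resumable upload will issue."""
    return math.ceil(file_size / chunk_size)

# For a ~300 MB file, an 8 MiB chunk means 38 round trips from the
# Cloud Function, while a 100 MiB chunk collapses it to 3.
```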
-
How do I send headers with metadata to disable the default 1-hour caching of files uploaded to GCS?
**Standard Example**
'Content-Type' => 'text/json',
'acl' => 'public-read',
'enabl…
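The 1-hour default comes from GCS serving publicly readable objects with `Cache-Control: public, max-age=3600` unless the object's `cacheControl` metadata says otherwise. A sketch of the metadata to set at upload time, using the JSON API field names; the exact option keys differ per client library:

```python
# Object metadata to send at upload time. "cacheControl" overrides the
# default "public, max-age=3600" that GCS applies to public objects.
upload_metadata = {
    "contentType": "application/json",
    "cacheControl": "no-cache, max-age=0",
}
```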
-
@shcheklein in that case our concurrency level will be `jobs * jobs`, which is generally going to be way too high in the default case. I also considered splitting `jobs` between the two (so `batch_s…
-
**What happened?**
When the medusa-restore init container attempts a restore from a GCS backup created using multipart uploads, timeouts and ultimately failure occur. This is likely due to the medusa …
-
We're trying to upload ~2.5k small files, and currently this takes around 8 minutes, even though we're on a gigabit connection (and GCS is quite fast).
It would be great to get some sort of bulk…
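Until a bulk API exists, the usual workaround is client-side parallelism: many small objects go much faster with several transfers in flight, since each upload is latency-bound rather than bandwidth-bound. A minimal sketch with the per-file upload left injectable (the real `upload_one` would be your existing single-file call, e.g. a `Blob.upload_from_filename`):

```python
from concurrent.futures import ThreadPoolExecutor

def upload_all(paths, upload_one, workers=32):
    """Upload many small files concurrently; results come back in input order.

    upload_one(path) is whatever single-file upload you already have;
    threads work here because the wait is network I/O, not CPU.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(upload_one, paths))
```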
-
Is there a way to set expiration headers on chunks in the GCS store, or does someone know how to deal with unfinished uploads?
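One way to clean up unfinished uploads is a bucket lifecycle rule: GCS supports an `AbortIncompleteMultipartUpload` action for stale XML-API multipart uploads, conditioned on age in days. A sketch of the lifecycle JSON (the 7-day age is illustrative):

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "AbortIncompleteMultipartUpload"},
        "condition": {"age": 7}
      }
    ]
  }
}
```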