Closed: Azathothas closed this 1 month ago.
Fixed it by patching the code itself
Hi @kapitanluffy , thank you for sending a fix. I had forgotten that I had even created this issue. Unfortunately, I went with another provider, so I can't test if this works.
No worries. I ran into a similar issue, so I thought I should share it here in case someone else sees this.
The issue here is that, as explained in the docs, rclone assumes that it needs to prefix the path with `file/`, but this worker is using the S3-compatible API to access B2, which doesn't have that `file/` prefix.

I'll look at fixing this by introducing a new environment variable.
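Roughly what I have in mind, as a sketch only: the variable name (`STRIP_FILE_PREFIX`), the bucket binding, and the overall structure below are placeholders, not the actual worker code.

```ts
// Sketch of the proposed fix (untested; names are placeholders).
// When STRIP_FILE_PREFIX is set, rewrite rclone's native-API path
// (/file/<bucket>/<key>) to the S3-compatible layout (/<key>)
// before forwarding the request upstream.

export interface Env {
  STRIP_FILE_PREFIX?: string; // assumed name for the new variable
  B2_BUCKET: string;          // assumed binding holding the bucket name
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    const nativePrefix = `/file/${env.B2_BUCKET}/`;

    if (env.STRIP_FILE_PREFIX && url.pathname.startsWith(nativePrefix)) {
      // Drop the native prefix so the S3-compatible endpoint sees /<key>
      url.pathname = "/" + url.pathname.slice(nativePrefix.length);
    }

    // The real worker would also re-sign the request for the
    // S3-compatible endpoint here; this only shows the path rewrite.
    return fetch(new Request(url.toString(), request));
  },
};
```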
My `wrangler.toml`
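(The actual file isn't reproduced here; below is a hypothetical minimal sketch of what a worker routed at `b2.my-domain.com` might look like. Every name and value is a placeholder.)

```toml
# Hypothetical placeholder config, not the actual file
name = "b2-proxy"
main = "src/index.ts"
compatibility_date = "2024-01-01"

routes = [
  { pattern = "b2.my-domain.com/*", zone_name = "my-domain.com" }
]

[vars]
# Assumed variable names for the S3-compatible endpoint and bucket
B2_ENDPOINT = "s3.us-west-004.backblazeb2.com"
B2_BUCKET = "my-bucket"
```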
My Bucket Configuration
My `rclone.conf`
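(Credentials below are placeholders; `download_url` is the line that matters and is the one from my setup.)

```ini
# Placeholder remote; only download_url is verbatim from this report
[b2]
type = b2
account = <application_key_id>
key = <application_key>
download_url = https://b2.my-domain.com
```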
- Downloading `https://b2.my-domain.com/my-file` directly: works ✅
- Using rclone `copy`/`copyto`: fails ❌ with `Failed to copyto: failed to HEAD for download: Unknown 403 Forbidden (403 unknown)`
Same error if I try copying from the bucket to my local system:

```
❯ rclone copyto "b2:/my-bucket/my-file" "$LOCAL_FILESYSTEM/my-file"
Failed to copyto: failed to HEAD for download: Unknown 403 Forbidden (403 unknown)
```
`rclone mount` fails with the same error.
Possible Cause
As per https://rclone.org/b2/#b2-download-url, rclone routes downloads through `download_url` using the native B2 layout, i.e. it prefixes the object path with `file/<bucket>/`.
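If I'm reading that right, the mismatch looks like this (illustrative URLs built from the bucket/file names above, not captured traffic):

```
# What rclone requests when download_url is set (native B2 layout):
https://b2.my-domain.com/file/my-bucket/my-file

# What the S3-compatible worker actually serves:
https://b2.my-domain.com/my-file
```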
May also be related:
@sshockwave filed a similar issue: https://forum.rclone.org/t/backblaze-b2-with-cloudflare-worker-conflicts-on-the-authorization/21353/3
https://github.com/rclone/rclone/pull/4896
As soon as I remove `download_url = https://b2.my-domain.com` from `rclone.conf`, everything works as expected, but then there would be no point in using Backblaze at all, since the egress cost alone would be too expensive for me.