Closed: StevenMapes closed this issue 6 months ago
Aren't they using s3 APIs? 🤔
That is true. I'm still waiting for a response to my access request so I can test it myself. I was hoping someone here might already have access and could confirm whether it works out of the box or whether any tweaks are required.
@StevenMapes You end up testing it?
Hi @StevenMapes,
I tested it and it's working with S3Boto3Storage for private media files. Since buckets are not public as of now, I didn't test the static files/public media part.
I wrote a blog post about it at https://djangotherightway.com/using-cloudflare-r2-with-django-for-storage
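For reference, a minimal sketch of that kind of private-media setup; the bucket name, account ID, and keys below are placeholders, not values from the post:

# settings.py (sketch; Django < 4.2 style, placeholders throughout)
DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"
AWS_STORAGE_BUCKET_NAME = "my-bucket"
AWS_S3_ENDPOINT_URL = "https://<account_id>.r2.cloudflarestorage.com"
AWS_S3_ACCESS_KEY_ID = "<r2_access_key_id>"
AWS_S3_SECRET_ACCESS_KEY = "<r2_secret_access_key>"
AWS_S3_SIGNATURE_VERSION = "s3v4"  # R2 only supports Signature Version 4
AWS_DEFAULT_ACL = None             # R2 has no per-object ACLs
# AWS_QUERYSTRING_AUTH defaults to True, so FileField.url returns
# time-limited presigned URLs, which is what a private bucket needs.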
We're successfully serving Django static files from an R2 bucket on a custom domain by attaching a CF worker to it.
Thanks for the little tutorial above. Unfortunately it doesn't seem to be compatible with R2's custom domains feature, which is the only way to use their caching. If you set AWS_S3_ENDPOINT_URL to the custom domain, uploads don't work, but rendering (with caching) does. And if you use AWS_S3_CUSTOM_DOMAIN, then it doesn't generate signed URLs (so uploading works, but rendering doesn't).
It's annoying, because it's so close to being there. If you could use AWS_S3_ENDPOINT_URL for uploading but a different URL for serving the signed images, it would be fine. If there's a way I've overlooked, please let me know.
@djch Any luck finding a way to solve this? I think I have the same problem, but I'm no good at media file handling, or DevOps in general.
My files get uploaded, but I can't retrieve them unless I allow public access to the bucket and use the public bucket URL; the same does not work when using a custom domain for the public URL. In private mode, the S3 API uploads the file but doesn't load it. I haven't even looked into caching yet, but I'll need that too.
I guess R2 is not quite viable to use yet.
@mikhail-skorikov for the time being I have implemented the same solution as @timkofu and deployed a CF Worker (called "render") to proxy requests in front of the R2 bucket on a custom domain. That seems like a viable workaround until django-storages has better (or native) support for R2 storage.
Would any of you want to share all the configs / code that you've put together to make this work? It's a shame that this doesn't work natively.
It now works as expected; create a bucket, choose a region, attach a subdomain, set up CORS, and voilà!
STORAGES = {"staticfiles": {"BACKEND": "storages.backends.s3boto3.S3StaticStorage"}}
AWS_STORAGE_BUCKET_NAME = "bucket_name"
AWS_LOCATION = "a_folder_inside_the_bucket"
AWS_S3_ACCESS_KEY_ID = "r2_key"
AWS_S3_SECRET_ACCESS_KEY = "r2_secret"
AWS_S3_CUSTOM_DOMAIN = "things.example.com"
AWS_S3_ENDPOINT_URL = "https://s3_api.url.from.r2_bucket.settings.page/"  # yes, without the bucket name appended
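With those settings, a quick sanity check from a Django shell (the file path here is just an example) should show static URLs built on the custom domain:

from django.contrib.staticfiles.storage import staticfiles_storage

# AWS_LOCATION is prepended to the object key, the custom domain to the URL.
print(staticfiles_storage.url("css/app.css"))
# https://things.example.com/a_folder_inside_the_bucket/css/app.css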
Is URL signing also working?
@timkofu I tried this and while the command does output a link, the link itself doesn't work. When I try to access the URL I get this error:
<Error>
  <Code>InvalidArgument</Code>
  <Message>Invalid Argument: Credential access key has length 20, should be 32</Message>
</Error>
Does it work for you? If "yes", are you using a paid cloudflare plan?
edit: sorry, wrong repo!
I did some research. Basically, this won't work for custom domains: custom domains must use HMAC validation (https://developers.cloudflare.com/ruleset-engine/rules-language/functions/#hmac-validation).
I've encountered and resolved this issue. In my case it was because we were still using our old AWS credentials. I'd suggest overriding the storage class and inspecting what access key it actually prints:
import boto3
from storages.backends.s3boto3 import S3Boto3Storage

class R2Storage(S3Boto3Storage):
    def _create_session(self):
        # Log the credentials the backend actually resolved, to catch stale AWS keys.
        print(f"access_key: {self.access_key}")
        print(f"secret_key: {self.secret_key}")
        if self.session_profile:
            session = boto3.Session(profile_name=self.session_profile)
        else:
            session = boto3.Session(
                aws_access_key_id=self.access_key,
                aws_secret_access_key=self.secret_key,
                aws_session_token=self.security_token,
            )
        return session
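A hypothetical way to exercise it from a Django shell, assuming the class lives in myapp/storage.py:

from myapp.storage import R2Storage  # hypothetical module path

storage = R2Storage()
storage._create_session()  # prints the access/secret key pair actually in use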
I don't think it's due to the custom domain. We're using a custom domain with no issue, and I've managed to get it fully working with R2.
In our case, we inherited the S3Boto3Storage class and overrode the default settings accordingly, along the lines of the sketch below.
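A minimal sketch of that kind of subclass: the attribute names are ones S3Boto3Storage already understands, but every value here is a placeholder, not our actual config:

from storages.backends.s3boto3 import S3Boto3Storage

class R2MediaStorage(S3Boto3Storage):
    # Class attributes take precedence over the AWS_* settings defaults.
    bucket_name = "my-bucket"                                       # placeholder
    endpoint_url = "https://<account_id>.r2.cloudflarestorage.com"  # R2 S3 API endpoint
    custom_domain = "media.example.com"                             # placeholder
    default_acl = None          # R2 has no per-object ACLs
    signature_version = "s3v4"  # R2 only supports SigV4
    file_overwrite = False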
@timkofu's config above worked for me.
There is an authorization issue related to relative paths in CSS files. A similar problem was encountered on AWS (#734, https://github.com/jschneier/django-storages/issues/734), which @dennisvang resolved by whitelisting the files in the bucket policy, but there's no equivalent for Cloudflare R2. Has anybody hit this, or found a solution? (One option is sketched below.)
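Not a confirmed fix, but one thing to try, assuming the broken relative references point at static assets: serve static files through S3StaticStorage, which disables query-string signing, from a bucket or custom domain that allows public reads, and keep only media private:

# Django 4.2+ STORAGES sketch; the backends are real, the public/private split is an assumption.
STORAGES = {
    "default": {"BACKEND": "storages.backends.s3boto3.S3Boto3Storage"},       # private media
    "staticfiles": {"BACKEND": "storages.backends.s3boto3.S3StaticStorage"},  # unsigned static URLs
}

Relative paths inside a CSS file resolve against the stylesheet's URL minus its query string, so those requests arrive unsigned and can only succeed if the objects are publicly readable.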
This is a pretty lame workaround, but I ended up using the S3Client from @aws-sdk/client-s3 in Node.js instead; it just needed a short script:
const { S3Client } = require("@aws-sdk/client-s3");
Docs added in #1378
With the announcement of Cloudflare R2, it would be great if we could add a backend to support it.