rmartz opened 8 years ago
Subclassing `S3BotoStorage` may help with this issue, and potentially with other issues involved in keeping storage backend instances separate:
```python
import urllib
import urlparse

from storages.backends.s3boto import S3BotoStorage


class PublicS3BotoStorage(S3BotoStorage):
    """
    Override of S3BotoStorage that strips authentication parameters,
    for use with public buckets.

    This is needed because of a boto issue:
    https://github.com/boto/boto/issues/1477

    When the issue is fixed, set the AWS_QUERYSTRING_AUTH setting to False
    and remove this workaround.

    This class was borrowed from:
    https://github.com/boto/boto/issues/1477#issuecomment-38759048
    """
    def __init__(self, *a, **k):
        kwargs = dict(querystring_auth=False)
        # Merge in any arguments that were passed, letting callers override
        # the default.
        kwargs.update(k)
        super(PublicS3BotoStorage, self).__init__(*a, **kwargs)

    def url(self, name):
        orig = super(PublicS3BotoStorage, self).url(name)
        scheme, netloc, path, params, query, fragment = urlparse.urlparse(orig)
        query_params = urlparse.parse_qs(query)
        if 'x-amz-security-token' in query_params:
            del query_params['x-amz-security-token']
        # doseq=True flattens the value lists produced by parse_qs.
        query = urllib.urlencode(query_params, doseq=True)
        return urlparse.urlunparse((scheme, netloc, path, params, query,
                                    fragment))
```
Note: We've used this technique in production with success, but the code here was interacting with a previous version of the storages backend. Hopefully most of it still applies. Newer versions of storages now have an option to strip the signing from the URL.
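The token-stripping step in `url()` can be exercised on its own. A minimal Python 3 sketch (using `urllib.parse` in place of the Python 2 `urlparse`/`urllib` modules; the signed URL below is a made-up example, not real output):

```python
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse


def strip_security_token(url):
    """Remove the x-amz-security-token query parameter from a URL."""
    scheme, netloc, path, params, query, fragment = urlparse(url)
    query_params = parse_qs(query)
    query_params.pop('x-amz-security-token', None)
    # doseq=True flattens the value lists produced by parse_qs.
    query = urlencode(query_params, doseq=True)
    return urlunparse((scheme, netloc, path, params, query, fragment))


# Hypothetical signed URL for illustration.
signed = ('https://example-bucket.s3.amazonaws.com/media/logo.png'
          '?x-amz-security-token=abc123&Expires=1700000000')
print(strip_security_token(signed))
# -> https://example-bucket.s3.amazonaws.com/media/logo.png?Expires=1700000000
```

Other query parameters are left in place, so this only removes the temporary-credential token rather than the whole signature.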
When loading a page using S3-cached files, the URLs used are suffixed with GET parameters, which cause them to bypass S3's cache mechanisms.

This is caused by Boto's signed URLs, which are necessary for loading private files. However, they are not necessary for public files and can be disabled, either globally with `AWS_QUESTRING_AUTH = False` or on a specific Boto storage instance with `S3BotoStorage(bucket="foo", querystring_auth=False)` (see http://stackoverflow.com/a/16818992). We should determine which is appropriate for our uses and update our configuration accordingly.
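As a sketch, the two options look like this (the bucket name is illustrative):

```python
# settings.py -- disable signed querystrings for every S3 storage:
AWS_QUERYSTRING_AUTH = False

# ...or disable them only on a specific instance, e.g. a storage
# dedicated to public assets, leaving other storages signed:
from storages.backends.s3boto import S3BotoStorage

public_storage = S3BotoStorage(bucket="foo", querystring_auth=False)
```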