canassa opened this issue 9 years ago
Yeah agree. I'm really interested in getting localshop running via docker on AWS with S3 as backend. So that would be great.
I've added the following to my config, which makes it work:
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = 'my-access-key'
AWS_SECRET_ACCESS_KEY = 'my-secret-key'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'
We should set these with values.Value() so that they can be set via environment variables.
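A minimal sketch of the env-var idea using only the standard library (values.Value() from django-configurations wraps the same pattern, with a DJANGO_ prefix on the variable names); the defaults shown are placeholders, not localshop's actual values:

```python
import os

# Each setting reads from the environment, falling back to a default.
# Empty-string defaults force the operator to supply real credentials.
DEFAULT_FILE_STORAGE = os.environ.get(
    'DEFAULT_FILE_STORAGE', 'storages.backends.s3boto.S3BotoStorage')
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID', '')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY', '')
AWS_STORAGE_BUCKET_NAME = os.environ.get('AWS_STORAGE_BUCKET_NAME', '')
```

This is what makes the Docker-on-AWS setup mentioned above practical: the same image can point at S3 just by exporting the variables at container start.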
@mvantellingen I am trying to set up localshop as our local PyPI server and have it working with the default local storage, but I could not get it working with the S3 backend. I tried your comment above and added those four settings to ~/.localshop/localshop.conf.py, but nothing changes the behavior. Any thoughts on how to debug this?
@mvantellingen @canassa Just figured out that the above changes are only on the develop branch. I was running the latest release, 0.9.3 (May 15, 2015), which does not include any of these settings. Is another release coming soon?
This issue is invalid because django-storages-redux is now django-storages on PyPI (as of February 2016).
See the history section of https://pypi.python.org/pypi/django-storages/1.5.1
Currently, localshop supports custom storage via the django-storages project. The problem is that django-storages is currently dead: there have been no releases since 2013-03-31, and the last commit to the Bitbucket repository was in March 2014. The documentation on how to use a custom storage backend is also not very good and needs an overhaul.
One option is to replace django-storages with django-storages-redux.